Frozen timestamps

A century and a half ago, when trains were plying but electricity was still not widely available, intrepid entrepreneurs cut huge chunks of ice from the frozen Great Lakes and transported them to California and Texas to cool drinks in summer.

Decades later, scientists developed a method of collecting ‘ice cores’, cylindrical chunks of ice, from different depths below the surface to study what lay trapped in them and, in turn, decipher the conditions that prevailed during that period. Air bubbles trapped in ice are really books of history.

Now researchers of the British Antarctic Survey are on a project to study ice cores drilled from up to 3 km below the surface of the Antarctic plateau to determine the state in which the continent existed 1.5 million years ago. This takes the research further back in history, building upon earlier work that examined the continent’s climate record going back 800,000 years.

The drilling site, Little Dome C, is about 40 km from the French-operated Concordia Station.

Dr Liz Thomas, Head of the Ice Cores team at the British Antarctic Survey, seeks to unlock the answer to why, around a million years ago, the duration of glacial cycles lengthened from 41,000 years to 100,000 years.

With the data collected from the cores, scientists will reconstruct how the environment was back then: temperatures, wind patterns, extent of sea ice, and so on.

“This unprecedented ice core dataset will provide vital insights into the link between atmospheric CO₂ levels and climate during a previously uncharted period in Earth’s history, offering valuable context for predicting future climate change,” Dr Thomas says in a statement.


Published on July 28, 2025




PARAM-1 learns local ways

In the rapidly expanding world of large language models (LLMs), English continues to dominate, leaving other languages in its shadow. This imbalance is particularly stark in India, where more than 20 official languages and hundreds of dialects are spoken daily. PARAM-1, a newly released bilingual foundation model, rises out of India’s own linguistic and cultural landscape.

The model is detailed in a paper published on arXiv (July 2025) by the BharatGen team, which includes Kundeshwar Pundalik, Piyush Sawarkar, Nihar Sahoo, and Abhishek Shinde. The authors describe PARAM-1 as a 2.9-billion parameter foundation model trained from the ground up to reflect Indian realities.

Beyond translation

The name PARAM has a legacy in Indian high-performance computing, but the new model signals a different ambition. PARAM-1 is not a simple upgrade of past systems; it is designed to create artificial intelligence that understands India as more than just another market.

Unlike most global models that treat Indian languages as peripheral, PARAM-1 dedicates 25 per cent of its training data to Hindi. This includes government translations, literary works, educational material and community-generated content. The rest of the dataset consists of English sources carefully curated for their factual depth and range.

A tokeniser is the first step in how a language model processes text. It breaks sentences into smaller units, or tokens, which the model can interpret.

Standard tokenisers, built for English, perform poorly on Indian scripts, splitting words into too many fragments. PARAM-1 addresses this with a script-aware tokeniser that recognises Hindi and other Indic scripts, creating fewer and more meaningful tokens. This improves both accuracy and efficiency.

Although PARAM-1 currently supports only English and Hindi, its tokeniser has been designed for broader Indian linguistic diversity. It can handle scripts such as Tamil, Telugu, Marathi and Bengali, laying the groundwork for future multilingual expansion.
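The fragmentation problem can be illustrated with a minimal sketch. This is not PARAM-1’s actual tokeniser; the syllable count below is only a stand-in for what a script-aware tokeniser achieves, while the byte count shows the worst case for an untrained byte-level tokeniser.

```python
import unicodedata

# Why byte-level tokenisers fragment Indic text: every Devanagari code
# point takes 3 bytes in UTF-8, so an untrained byte-level tokeniser can
# emit roughly three times as many tokens per character as for ASCII.

def byte_token_count(text: str) -> int:
    """Worst case for a byte-level tokeniser: one token per UTF-8 byte."""
    return len(text.encode("utf-8"))

def script_aware_token_count(text: str) -> int:
    """Idealised script-aware count: roughly one token per syllable
    cluster, approximated by ignoring combining marks (vowel signs,
    virama), whose Unicode categories start with 'M'."""
    return sum(1 for ch in text if not unicodedata.category(ch).startswith("M"))

hindi = "नमस्ते"        # 6 code points, 18 UTF-8 bytes
english = "namaste"     # 7 code points, 7 UTF-8 bytes

print(byte_token_count(hindi), byte_token_count(english))   # 18 7
print(script_aware_token_count(hindi))                      # 4
```

The gap between 18 byte-level units and 4 syllable-level units for a single Hindi word is the efficiency loss a script-aware tokeniser is designed to recover.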

Design, not retrofit

PARAM-1 is the result of a training strategy that prioritised inclusion from the start. It was trained in three phases, beginning with general language learning, followed by a focus on factual consistency, and, finally, long-context understanding. This structure allowed the model to gradually develop fluency, retain factual information more effectively, and improve performance on tasks that require reading and reasoning over longer texts.

The model was tested not just on widely used English-language benchmarks such as MMLU and ARC Challenge, but also on India-specific datasets. These included MILU, which draws on Indian competitive examinations, and SANSKRITI, a benchmark that covers cultural knowledge ranging from festivals to geography. The results were encouraging. PARAM-1 performed competitively on global benchmarks and outperformed several open models on Indian tasks, especially in Hindi.

More languages

Although PARAM-1 is presented as a model designed for India, its bilingual focus means that other Indian languages are still excluded. This raises questions over the model’s inclusivity, especially in a country where linguistic identity often intersects with regional politics and access to services.

The team behind PARAM-1 appears to be aware of this limitation. The tokeniser was specifically engineered to handle the morphological patterns found in Indian languages beyond Hindi. While this does not compensate for the lack of direct training in those languages, it does provide a foundation for expanding the model’s linguistic reach in future iterations.

Equitable AI

PARAM-1 is not a frontier-scale model, nor does it claim to be the most powerful LLM available. Its significance lies in a different direction. It shows what can happen when the design of an AI model reflects the needs and complexities of the people who are meant to use it.

The development of PARAM-1 offers a blueprint for equitable AI design. It highlights the importance of investing early in diverse data, language-aware infrastructure, and public benchmarks that reflect regional and cultural realities. The model also invites broader participation from government agencies, universities, and private firms, especially if it is to grow into a truly multilingual and domain-specialised platform.

The authors of the model offer a clear message in their conclusion: Fairness in AI cannot be treated as an afterthought. It must be addressed in the earliest stages of design. PARAM-1 currently supports just two languages, but leaves the door open for many more. It serves as a reminder that if artificial intelligence is to serve all of humanity, it must begin by learning to listen to more of it.


Published on July 28, 2025




‘Without a mature digital core, firms cannot monetise AI’

AUTOMATED ANSWERS: Ramprakash Ramamoorthy, Director of AI Research, Zoho

As an intern at software major Zoho in 2011, Ramprakash Ramamoorthy had worked with teams that were still figuring out whether machine learning could be integrated into the firm’s suite of products. Sentiment analysis, anomaly detection, recommendation engines — these were his first few brushes with the working of algorithms, before he surfed the Alexa and Siri waves of 2018 and the ChatGPT wave of 2022.

Recently, Zoho launched three large language models and a speech recognition model (in English and Hindi, with 15 more regional languages to follow). On the sidelines of the launch of Zoho’s platform for building ‘agents’ — autonomous software systems — Ramamoorthy, now Director of AI Research, told businessline that the focus remains firmly on offering value to customers and securing their privacy.

Edited excerpts from the chat:

As AI director, what is your mandate?

We have a hub-and-spoke model for AI development. I am a part of a group called Zoho Labs, where we take care of the foundational technology; then there are 55-plus product teams, which build on top of the foundation we provide.

What does Zoho Labs do?

We’re about 200 people in the team, distributed across Nagpur, Tenkasi, Chennai and other locations. We also have a five-member team in Mexico for our Spanish initiatives.

We have teams that work on databases. Then there’s a hardware acceleration team. Last year, we announced our partnership with Nvidia… their work goes towards the technical foundation. We have a hardware team that works with AMD, Intel and others. We think AI hardware is super-important to ride the AI/ML wave.

Where does Zoho stand in the current agentic AI wave?

We have 25-plus pre-built agents, including some India-specific agents for Aadhaar verification and so on. In Agent Studio, you can prompt and build your own agents. Our MCP [model context protocol] server connects models like GPT or Claude to Zoho’s APIs, data models, and actions.

How will AI change enterprise software?

A lot of it will become prompt-driven, with users not needing to learn to use it. Enterprises are overwhelmed… none have got an ROI from their existing AI stack. So, the first thing is to get your digital maturity right.

Do all firms need agentic AI?

Wherever there’s a repetitive workflow, agents can add value. With our pre-built agents we saw 10-30 per cent productivity gains, but they cannot replace my support team. People talk about 10X, but we haven’t seen that. Companies must first find out what can be automated and remove the data silos for a smooth flow of data. Yes, agents will be important, but they cannot replace humans… a strong digital foundation should be at the core of agentic AI usage.


Published on July 28, 2025




Antarctic Ocean’s briny puzzle

Something strange is happening in the Antarctic Ocean, which has scientists baffled. They have some theories but have been unable to nail down the cause of the problem.

The Antarctic Ocean’s surface waters have been turning saltier since 2015.

Normally, ice melts in summer. The meltwater forms a layer on the surface, floating over denser saltwater below — a phenomenon known as ‘stratification’. The floating freshwater acts like a lid, preventing warmer saltwater from rising to the surface.

In winter, the freshwater would freeze again — but less and less due to global warming.

Since 2015, for reasons not well understood, the stratification is weakening, allowing more subsurface saltwater to mix with the freshwater, turning the surface water saltier.
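The ‘lid’ described above rests on a density contrast that can be sketched with a simplified, constant-temperature equation of state. The linear slope of roughly 0.8 kg/m³ per g/kg of salinity is a typical textbook value, not a figure from the Southampton paper.

```python
# A simplified sketch of why surface meltwater floats: at a fixed
# temperature, seawater density rises roughly linearly with salinity,
# so fresher water is lighter and stays on top (stratification).

RHO_FRESH = 1000.0   # kg/m³, fresh water (approximation)
HALINE_SLOPE = 0.8   # kg/m³ per g/kg of salinity (typical value)

def density(salinity_g_per_kg: float) -> float:
    """Linearised density of seawater at fixed temperature."""
    return RHO_FRESH + HALINE_SLOPE * salinity_g_per_kg

meltwater = density(0.0)    # fresh meltwater layer on the surface
seawater = density(34.5)    # typical Southern Ocean surface salinity

# The denser seawater sits below the meltwater 'lid'.
print(meltwater < seawater)   # True
```

When the lid weakens, mixing raises the surface salinity toward the subsurface value, which is exactly the trend observed since 2015.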

This, in turn, hampers ice formation in winter, shrinking the cryosphere.

A group of researchers from the University of Southampton, UK, used satellite images to study ‘salinity signatures’.

They note that the rapid changes observed over the past decade contradict the conventional wisdom that global warming drives up the volumes of surface freshwater.

“This suggests that current understanding and observations may be insufficient to accurately predict future changes,” they say in a paper published in PNAS, suggesting closer monitoring.

Caroline Holmes, a polar researcher at British Antarctic Survey, pointed out to Livescience.com that the Southern Ocean below the surface is “chronically underobserved.”


Published on July 14, 2025




Adding salt to a heating solution

Store heat energy for months on end and draw it out when needed in the really cold months. Sounds far-fetched? Not to Dr Sandip Saha, Dr Chandramouli Subramaniam and Dr Rudrodip Majumdar, who have developed a prototype device using a salt — strontium bromide — to demonstrate that heat can be stored for long periods, much like the gas cylinders we use at home.

In cold climes, especially in the Himalayan north, wood is predominantly used as fuel for heating. Diesel as fuel is not only a pollutant but also scarcely available in these regions.

So, what is the alternative? Enter strontium bromide. “Strontium bromide stores heat much like a battery stores electric energy,” Saha, Professor, IIT-Bombay, Department of Mechanical Engineering, told businessline.


Thanks to its high energy density, chemical stability, non-toxicity, non-explosive nature, and environmental safety, the salt lends itself to use here, he says.

Prototype design

The team developed a prototype featuring solar thermal air collectors, which use sunlight to heat air during summer. The hot air is then used to warm a form of hydrated strontium bromide (hexahydrate). In this form, the strontium bromide crystals contain water molecules within their structure.

The salt absorbs heat energy as it undergoes a dehydration reaction — namely removal of water from the salt. “This reaction helps store the absorbed solar energy as chemical potential in the salt,” says Saha.

Now, to get heat out of the salt, all you have to do is reverse the process — pass moist air through it. In other words, when the salt is ‘rehydrated’, it releases the stored heat.
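The charge-discharge chemistry can be sketched as a reversible hydration reaction. A commonly cited pair for strontium bromide storage is hexahydrate to monohydrate, shown here as an illustration; the team’s prototype may operate over a different hydration range.

```latex
% Charging (dehydration) absorbs heat; discharging (rehydration) releases it.
\mathrm{SrBr_2 \cdot 6H_2O} + \text{heat} \;\rightleftharpoons\; \mathrm{SrBr_2 \cdot H_2O} + 5\,\mathrm{H_2O}
```

Because the energy is held as chemical potential rather than as sensible heat, the dehydrated salt can, in principle, sit at ambient temperature for months without losing its charge.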

Apart from the solar thermal collectors, the device consists of a reactor chamber filled with strontium bromide, and a small air circulation system for the dehydration and rehydration cycles. The set-up is encased in a weatherproof unit designed for Himalayan conditions and is insulated using glass wool.

The storage module, points out Saha, “is about the size of two LPG cylinders we use at home. The heating can be done in sunnier regions such as Gujarat or Rajasthan and the dehydrated salt can be carted up the hills just before winter sets in”. For household use, the system would primarily consist of a reactor unit (containing the salt-silica gel), a blower, and a small control system. The heating solution lends itself to spaces of about 100 sq ft.

Cost-efficiency

The unit does not require high maintenance, the researchers say. The prototype has been used to demonstrate six charging and discharging cycles with no slip in performance. Salts such as strontium bromide are theoretically capable of about 600 cycles.

How do the costs compare with using diesel for heating? “Electricity from diesel costs us ₹50 per unit (kWh),” says Majumdar, currently Associate Professor at the National Institute of Advanced Studies. “If we add a carbon penalty, it could go up to ₹78 per unit. The thermochemical solution is expected to come at half the price.”

According to an article on the IIT-Bombay website, the study determined the thermochemical systems’ ‘levelised cost of heating’ (LCOH) — the average cost of producing usable heat over the lifetime of a heating system — to be ₹33–51 per kWh in different Himalayan cities. This makes it competitive with, or cheaper than diesel heating for daily use, especially when factoring in environmental costs. In Leh, specifically, LCOH dropped to ₹31 per kWh, the lowest among all the locations studied (Darjeeling, Shillong, Dehradun, Shimla, Jammu, Srinagar, and Manali being the rest).
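The LCOH idea can be sketched in a few lines: discount lifetime costs and delivered heat to present value, then divide. All the numbers below are hypothetical placeholders for illustration, not the study’s actual inputs.

```python
# A hedged sketch of 'levelised cost of heating' (LCOH): total
# discounted lifetime cost divided by total discounted heat delivered,
# giving a per-kWh figure comparable with diesel heating.

def lcoh(capital_cost: float, annual_opex: float,
         annual_heat_kwh: float, lifetime_years: int,
         discount_rate: float = 0.08) -> float:
    """Levelised cost of heating, in currency units per kWh."""
    cost = capital_cost   # up-front cost is already at present value
    heat = 0.0
    for year in range(1, lifetime_years + 1):
        factor = (1 + discount_rate) ** year
        cost += annual_opex / factor      # discounted running cost
        heat += annual_heat_kwh / factor  # discounted heat delivered
    return cost / heat

# Hypothetical inputs: a 150,000-rupee system, 2,000 rupees/year upkeep,
# 1,200 kWh of useful heat per winter, and a 9-year salt life.
print(round(lcoh(150_000, 2_000, 1_200, 9), 1))
```

The same function, fed with diesel generator costs instead, yields the per-kWh figures the two options are compared on.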

Reinforcement

But the setup did come with its own problems initially. Like common table salt, strontium bromide, too, readily absorbs moisture from the air. If exposed to excessive humidity, the salt can liquefy, making it useless for repeated cycles.

Saha and his team found that mixing strontium bromide with silica gel helped in two ways: it absorbs extra moisture from the air during the hydration (discharging) phase, preventing the strontium bromide from dissolving; and, as the salt itself is not very strong structurally, silica gel provides support, allowing the salt to withstand the hundreds of cycles without degrading.

The optimal mix is 75 per cent strontium bromide and 25 per cent silica gel. This enables the salt mixture to be recycled for extended periods, potentially 8-9 years.

But why salt? Why not use solar power for heating?

Says Majumdar, “In the case of solar power for heating, batteries would be necessary to store energy captured during the day, for use at night. In colder climes, the state of charge for the batteries degrades faster and the chemical reactions that generate electricity within the battery slow down, reducing the battery’s efficiency.”

In contrast, he says, the salt-based system deals with only heat energy, without conversion to other forms. Therefore, intermediate losses, as seen with energy conversion, are avoided. Solar panels typically convert sunlight to electricity with an efficiency range of 15-22 per cent for commercially available panels.

Further, Majumdar points to the environmental impact: “While places like Ladakh have good ‘direct normal irradiance’ (DNI) for solar charging during summer, the aim is to avoid adding construction activity to already vulnerable areas. Transportation of the salt modules via existing supply routes (such as food grains or army supplies) makes it a viable solution.”


Published on July 14, 2025




Photonic radars — reality, hype and beyond

Conventional radars struggle to generate high-frequency signals beyond 40 GHz | Photo Credit: enot-poloskun

On June 29, the Defence Research and Development Organisation (DRDO) announced that it had developed a photonic radar system and was readying it for trials later this year. India will likely become the fourth country, after the US, China and Israel, to induct these radars.

While conventional radars use electronic devices (oscillators) to generate radio frequency (RF) signals, photonic radars combine two laser beams of slightly different frequencies (optical heterodyning) to generate, process, and analyse RF signals. Conventional radars find it difficult to generate high-frequency signals beyond 40 GHz. Photonic integrated circuits (PIC) can generate RF signals starting from 100 GHz and all the way to terahertz. This provides many advantages, as spelt out by DRDO.
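The arithmetic of optical heterodyning can be sketched as follows. The 1550 nm wavelengths are illustrative telecom-band values, not DRDO’s specification: mixing two lasers whose optical frequencies differ slightly produces a beat note at the difference frequency, which becomes the radar’s RF carrier.

```python
# Optical heterodyning, in numbers: two lasers near 1550 nm differ by
# less than a nanometre in wavelength, yet their difference frequency
# lands around 100 GHz -- far beyond what electronic oscillators reach.

C = 299_792_458.0  # speed of light in vacuum, m/s

def optical_frequency(wavelength_nm: float) -> float:
    """Frequency in Hz of light with the given vacuum wavelength."""
    return C / (wavelength_nm * 1e-9)

f1 = optical_frequency(1550.0)   # roughly 193.4 THz
f2 = optical_frequency(1550.8)   # slightly longer wavelength
beat_ghz = abs(f1 - f2) / 1e9    # beat note detected by the photodiode

print(f"beat frequency of about {beat_ghz:.0f} GHz")
```

Tuning either laser shifts the beat frequency, which is also what makes the frequency hopping described below straightforward in a photonic front end.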

The photonic radar’s ability to detect is significantly higher. It can, for example, call out an incoming hypersonic missile.

It generates purer signals — less of the ‘noise’ that emanates from the heat generated by electronic components — leading to sharper detection of the ‘echo’ from the target. Moreover, photonic radars can generate high bandwidth signals — the higher the bandwidth, the greater the resolution. In simple terms, you not only detect the target well, but you also ‘see’ it better.

Further, photonic radars are highly jam-resistant — photonic components are practically immune to electromagnetic jamming. Jammers send a lot of ‘noise’ or fake signals to confuse the radar, but photonic radars are not fooled. First of all, jammers typically do not send high-frequency signals. More importantly, photonic radars are capable of ‘frequency hopping’ — they keep changing their frequencies, which confuses the jammers.

Finally, photonic components do not have copper and are, hence, lighter. This is an ace up the sleeve. Imagine fitting these smaller, lighter radar systems in satellites, swarms of drones and fighter jets!

Now, if you pair photonic radars with gallium nitride (GaN) semiconductors, you will have a radar that is potent — GaN semiconductors can amplify signals efficiently, as explained in ‘Stealth technology: To see and not be seen’ (Quantum dated June 15, 2025), allowing them to travel farther and return stronger echoes.

DRDO has a working prototype, which corresponds to a ‘technology readiness level’ (TRL) of 6. That is heartening.

Swiping away hype

While DRDO has been measured in its announcement, social media comments show that many Indians are kvelling at the news. However, some reality checks are needed. One big challenge for India lies in gaining access to PICs, since the country is not equipped to fabricate them. Moreover, photonics calls for special materials, mainly indium phosphide and silicon photonics, which are not easily available.

India will have to design the circuits and get them fabbed elsewhere — but where? The US has export restrictions. Accessing from China is, of course, out of the question. Likewise, other components like tunable lasers and modulators are tough to get. So, ‘from the working prototype to industry’ is not a short hop, but a giant leap.

Edging ahead

Photonic radars are cutting edge; other countries are honing the edge. India, with its development of photonic radars, is not leading but catching up. In these emerging technologies, no country is much ahead of the start-line. India has an opportunity to lead.

There are at least two radars on the tech treadmill that promise to be better than photonic radars.

One is the quantum radar. As the name suggests, it uses quantum technology for detection and imaging. As a 2019 article in MIT Technology Review says, the US has made some progress.

At the core of this technology is the production of a pair of entangled photons — sending one to the target and then comparing it on reflection with the second photon; the difference will tell the target’s tale: its location and how fast it is moving. This is high physics. By the looks of it, quantum radars are a long way away.

The second potential technology is the ‘terahertz radar’, which operates in the electromagnetic spectrum between microwaves and infrared light, typically 0.1-10 THz — called the ‘terahertz gap’ — where the signal oscillates a trillion times a second. The corresponding wavelength is about 0.3 mm.

Terahertz technology is not as recondite as quantum radar — quite a few countries have made some progress, though there is no record of a military deployment. The good news is that India is also in the game. The Ultrafast Terahertz Spectroscopy and Photonics Lab at the Jawaharlal Nehru Centre for Advanced Scientific Research, Bengaluru, the Terahertz Communication and Sensing Group at IIT-Roorkee, and the University of Hyderabad are keeping India in the reckoning.


Published on July 14, 2025


