State Vs. Plate: India’s Long History of Food and Consumption Restrictions

Prime Minister Narendra Modi’s recent appeal asking citizens to avoid unnecessary foreign travel, cut discretionary fuel use, postpone non-essential purchases, and embrace restraint in consumption has revived an old Indian political tradition: the call for austerity in moments of uncertainty. The appeal, made in the context of global tensions and fears over fuel and supply disruptions linked to West Asia, was framed as a precautionary economic measure.

By no means is this new. India has heard such calls before.

Since Independence, governments across political ideologies — Congress, socialist coalitions, Janata regimes, and even regional administrations — have periodically attempted to regulate what people eat, how much they consume, how lavishly they celebrate, and even how many guests they invite. Sometimes these restrictions emerged from genuine shortages. Sometimes they reflected wartime economies. Sometimes they were moral projects tied to ideas of discipline, simplicity, Gandhian restraint, or anti-elitism. And at times, they became deeply political.

The Era of Scarcity: Rationing and Food Controls

In the first decades after Independence, India was a food-deficit country. Grain shortages, droughts, foreign exchange crises, and dependence on imports shaped policy thinking.

The ration card became one of the defining documents of Indian life. Urban Indians especially grew up in a world of controlled sugar, kerosene, rice, and wheat distribution. The Essential Commodities Act of 1955 empowered governments to regulate production, storage, transport, and distribution of key goods.

This was not merely administrative economics; it shaped everyday culture.

Families planned meals around availability. Weddings became simpler in drought years. Restaurants faced restrictions on serving certain foods. In many states, governments imposed “rice control orders” limiting movement and stocking of grains. Hoarding and black marketing became criminal offences.

During severe shortages in the 1960s, Prime Minister Lal Bahadur Shastri famously urged Indians to skip one meal a week. “Monday fasts” became a patriotic exercise in many households. Restaurants in several cities reportedly shut on Monday evenings in support of the campaign.

The symbolism mattered as much as the economics. Food restraint was projected as national duty.

Wedding Restrictions: When the State Counted Your Guests

Perhaps the most striking example of state intervention in private consumption came through attempts to regulate weddings.

India’s long-standing anxiety over “wasteful expenditure” often found weddings at the centre of policy debates. Lavish feasts were criticised not only as economic excess but also as drivers of social inequality.

The most famous attempt came during the Emergency (1975–77), when the government imposed restrictions on the number of guests and dishes at weddings in several places. Though implementation varied across states and districts, stories abound of officials inspecting marriage halls and counting attendees.

Even outside the Emergency, states periodically experimented with controls:

  • Limits on the number of dishes served.
  • Restrictions on use of electricity and lighting during shortages.
  • Curbs on late-night celebrations.
  • Controls on loudspeakers.
  • Taxes or permissions for large gatherings.

In the 1970s and 1980s, “simple marriage” campaigns were encouraged by politicians and social reformers alike. Government employees in some sectors were informally encouraged to avoid extravagant ceremonies.

These debates continue today in different forms. Environmental concerns, food wastage, traffic congestion, and conspicuous consumption have all entered the conversation.

Ironically, Indian weddings evolved in the opposite direction. Liberalisation in the 1990s transformed them into giant economic ecosystems involving tourism, fashion, catering, décor, jewellery, entertainment, and destination hospitality.

The “Big Fat Indian Wedding” became both aspiration and industry.

Meatless Days and Regulating Food Habits

Food restrictions in India have rarely been only about economics. They are also about morality, religion, and identity.

Across decades, many Indian cities and states have periodically imposed bans on slaughter or meat sales during religious festivals.

These restrictions reveal a deeper Indian tension: food is intensely personal, but also intensely political.

The Anti-Waste Moral Economy

A recurring theme across Indian public life is the suspicion of conspicuous consumption.

During crises — wars, droughts, inflationary periods, oil shocks — governments often invoke the language of sacrifice.

In the 1970s oil crisis, many countries experimented with fuel-saving measures. India too promoted conservation campaigns. More recently, during COVID-19 lockdowns, public messaging encouraged minimal movement, reduced fuel consumption, and simplified social ceremonies.

The latest appeal by the Prime Minister fits into this long tradition. Public reactions, unsurprisingly, have been mixed.

On social media, some users compared the moment to earlier periods of austerity and wartime discipline, while others questioned whether ordinary citizens should bear the burden of global crises. Online discussions reflected anxieties about fuel prices, inflation, work culture, and economic uncertainty.

This duality is very Indian.

The State vs The Plate

India’s relationship with food control differs from many Western democracies because food here is not merely nutrition or commerce. It intersects with caste, religion, region, language, ecology, and politics.

At the same time, the Indian state has historically justified intervention using three broad arguments:

  1. Scarcity management.
  2. Social reform.
  3. Public morality.

The justification changes with the era.

In the 1960s it was famine anxiety. In the 1970s it was socialism and anti-elitism. In the 1990s it became public order and urban governance. Today it is often linked to sustainability, nationalism, health, or cultural identity.

What Restrictions Reveal About India

Food and guest restrictions may appear trivial compared to constitutional politics or macroeconomics. Yet they reveal something fundamental about India.

They show how deeply the state has historically engaged with everyday life.

They also reveal a persistent belief among governments that national crises require behavioural change from citizens — not just policy change from institutions.

Whether it was Shastri’s appeal to skip meals, Emergency-era guest limits, anti-hoarding drives, meat bans during festivals, or recent calls to reduce consumption amid global uncertainty, the underlying message has remained similar:

Private behaviour is seen as part of public national discipline.

India may change governments, ideologies, and economic models — but the debate over what citizens should eat, spend, serve, celebrate, or conserve never quite disappears.

A crisis leads to innovation and change. If this debate can lead to a re-think on the obscenely lavish weddings which have become the norm, it may be one of the good things to come out of this situation.

–Meena.

Pic: BBC

Mother’s Day: From Concept to Commerce

As the countdown begins and the hype builds up to Mother's Day, card, gift, and flower sellers, and restaurants look forward to a bonanza. It is yet another day among the many "days" (there are now at least five for every one of the 365 days of the year) that crowd the calendar. While these days are created, in many cases, to commemorate an event or person, or to raise awareness about a cause, they have also become lucrative occasions for marketing memorabilia.

Most of us have never thought about Mother's Day beyond debating what to gift Mom. In fact, my generation does not remember celebrating such a day at all. We assumed that it was a relatively new concept. Well, surprise, surprise! The history of this day dates back over a century, and ironically, it was started with completely different objectives.

The story goes way back to the 19th century and an ordinary working-class woman, Ann Maria Reeves Jarvis, who lived in the Appalachian area of West Virginia in the United States. Ann Maria bore more than a dozen children but, as was common in those times, lost most of them to childhood diseases like diphtheria and measles. While most families accepted childhood mortality as the will of God, Ann Maria felt that a major factor was the unhealthy and unsanitary conditions amidst which the community lived. She felt it was important that families, and especially women, were made aware of this. As an active member of the local Methodist Episcopal Church, she organized Mothers' Work Clubs where she raised awareness about hygiene and sanitation, and the vital importance of boiling drinking water. The church promoted special Mothers' Work Days when women would work together to collect trash and undertake other projects to improve local environmental conditions. The organisers provided medicine and supplies to sick families, and when necessary, quarantined entire households to prevent epidemics.

The American Civil War that began in 1861 changed the focus of Ann Maria's work. She organised women's groups to help soldiers from both sides who were sick or wounded. She worked to promote peace and unity. In 1868, despite threats of violence, she organized a Mothers' Friendship Day to bring families of both sides together to restore a sense of community. She strongly believed that women, and especially mothers, were best suited to bring people together with a goal of peace.

Thus Ann Maria Jarvis spent her life mobilising women to work for improving the lives of their children. She fervently hoped that the vital work of mothers would be recognized. She once wished: "I hope and pray that someone, sometime, will found a memorial mother's day commemorating her for the matchless service she renders to humanity in every field of life. She is entitled to it."

Ann Maria Jarvis died in 1905. Her daughter Anna Jarvis set out to make her mother's dream a reality by designing a Mother's Day celebration in honour of her mother. She chose the second Sunday in May to mark the anniversary of her mother's death. Anna Jarvis herself never married or had children, and she viewed motherhood simply through the eyes of a daughter. Thus she constructed a child-centered celebration of motherhood for Mother's Day: a "thank-offering" from sons and daughters and the nation "for the blessing of good homes." She requested children to visit or write letters home on this day. She chose her mother's favourite flower, the white carnation, as an emblem for this day.

The first Mother’s Day celebration was held on 10 May 1908 in Andrews Methodist Church in Grafton, Ann’s hometown. Anna handed out hundreds of white carnations to the mothers who attended. After that Anna lobbied to get official recognition for this day. The day was granted federal recognition by President Wilson in 1914, just before the start of World War I.

While she succeeded in getting Mother's Day adopted as a national holiday, Anna Jarvis saw the concept as her intellectual and legal property, and not as part of the public domain. She wished for Mother's Day to remain a "holy day," to remind us of our neglect of "the mother of quiet grace" who put the needs of her children before her own.

But the market realized the commercial appeal of a sentimental celebration of motherhood, combined with a daughter's story of memorialization. The appeal to write letters home fueled a huge boom in the greeting card industry, and the white carnation gesture bloomed into a thriving market for all flowers. By the early twentieth century, it had become yet another observance that turned into a "burdensome, wasteful, expensive day".

Anna Jarvis was appalled by the commercialization. She publicly denounced all such gimmicks, and registered her protest in many ways in different forums—from boycotts to gate crashing conventions. She spent the rest of her life fighting, and spending all her money on opposing what she thought was the distortion of the original sentiment of the day, and the crass profiteering that this resulted in. She also struggled for a copyright on the phrase “Second Sunday in May, Mother’s Day”. She insisted that it was Mother’s Day and not Mothers’ Day as was being marketed, and which was also a ploy to counter her copyright claims.

Anna could have easily profited from the day by claiming royalties from the sales of cards and flowers, but she remained firm in her opposition to the commercialization of her original sentiment. She and her sister survived on the small inheritance from their father. She was so distraught at the way the commemoration morphed into a commercial extravaganza that she even started a petition in 1943 to have the national holiday recalled. She spent her last days in a sanatorium in Philadelphia, and died of heart failure in 1948.

This year as we plan special treats for our mothers (and they certainly deserve those!) let us also remember how it all began, and Anna Jarvis who regretted creating this day! And let it not be only a once-a-year gesture. We can show we care and love, in more ways than one. 

Happy Mother’s Day!

–Mamata

The First Electrical Voting Machine

With election-fever and results-fever just abating, one of the topics of discussion has of course been the controversial EVM—Electronic Voting Machine.

Where did it all start? Surprisingly with Edison—yes he of the light bulb fame.

In 1869, Thomas Edison patented what is widely regarded as the first electrical voting machine—an invention designed to automate and speed up vote counting. This was his very first patent: U.S. Patent 90,646, granted on June 1, 1869. It was designed to allow legislators to vote "yes" or "no" using a switch that sent signals to a central board.

Edison’s vote recorder was technically sound. By all accounts, it worked exactly as intended. But when he demonstrated it to legislators in Washington, D.C., they turned it down. Not because it was flawed—but because it was too efficient.

At the time, voting in legislatures was a slow, deliberate process. Delays were not bugs; they were features. They allowed for persuasion, negotiation, and, frankly, political maneuvering. A machine that eliminated delay also eliminated strategy. Edison would later reflect that this rejection taught him a lasting lesson: invent only what people are ready to use. The vote recorder’s rejection wasn’t about engineering; it was about human systems resisting change.

Edison’s Curious Patent Portfolio

Edison went on to file over a thousand patents, many of them transformative, some delightfully obscure. Alongside world-changing inventions like the incandescent light bulb and the phonograph, there were also lesser-known creations: an electric pen for duplicating documents, a system for preserving fruit, even ideas for concrete furniture.

Some succeeded because they met an immediate need. Others failed because the ecosystem—technological, social, or economic—wasn't ready. But they probably sowed the seeds for many a current-day device.

The Slow March Toward Voting Machines

Despite Edison’s early setback, the idea of mechanizing voting didn’t disappear. By the late 19th and early 20th centuries, mechanical voting machines began appearing in the United States. These lever-based systems aimed to reduce fraud and standardize ballot counting.

Over time, technology evolved. Punch-card systems, infamously remembered from the 2000 United States presidential election, introduced new efficiencies along with new vulnerabilities. Hanging 'chads' became part of the vocabulary, illustrating how even small technical flaws could undermine trust.

By the late 20th and early 21st centuries, electronic voting machines (EVMs) emerged as the next step. These ranged from Direct Recording Electronic (DRE) systems to optical scan ballots and, more recently, hybrid systems with paper audit trails.

Trust, Technology, and Tension

Electronic voting systems around the world have sparked debate. Critics raise concerns about hacking, lack of transparency, and the difficulty of verifying results independently. Supporters counter that well-designed systems are more accurate and less prone to human error than paper ballots.

Countries have taken different paths. While Brazil has widely adopted electronic voting, others like Germany have rolled back its use, citing constitutional concerns about transparency. The Netherlands and Ireland have also stepped away from electronic systems after public and political pushback.

Even within the United States, practices vary widely by state, reflecting a broader unease about balancing efficiency with trust.

India and the EVM

Few countries have embraced electronic voting as extensively as India. Introduced on a large scale by the Election Commission of India, EVMs were designed to tackle logistical challenges: vast electorates, difficult terrains, and the need for rapid, reliable counting.

Indian EVMs are standalone devices, not connected to the internet, which proponents argue makes them more secure. The addition of VVPAT (Voter Verifiable Paper Audit Trail) systems is supposed to strengthen transparency by allowing voters to confirm their choices.

And yet, controversies and fears persist.

The Real Lesson: Technology Isn’t Neutral

Edison’s failed vote recorder reminds us of something we often forget: technology does not exist in a vacuum. It interacts with human behaviour, institutional norms, and political incentives.

Voting, perhaps more than any other civic act, depends not just on accuracy but on perceived legitimacy. A system can be technically flawless and still fail if people don’t trust it. The technical aspects may work fine, but is it still corruptible when the system itself is corrupt?

In that sense, the story comes full circle.

Edison built a machine to make voting faster. Lawmakers rejected it because speed threatened the very nature of their process. More than a century later, we are still grappling with the same tension—between efficiency and trust, innovation and acceptance.

The question is no longer whether we can build better voting machines. It is whether societies are ready to believe in them.

And if Edison were around today, he might recognise the problem instantly.

–Meena

Pic: https://edison.rutgers.edu/life-of-edison/inventions

When Prizes Change the World: What Innovation Contests Teach Us (and Why India Should Care)

Most prizes are given to those who have already changed the world: the Nobel, the Magsaysay, any number of national recognitions. These prizes are ways in which the world recognizes a lifetime's work, a breakthrough discovery, timeless writing, selfless humanitarian aid. The awards in these instances are, however, collateral benefits. For these greats, often the work is its own reward.

But in some cases, the prize itself is the motivator, it is the way to spur developments to change the world. Some of the most transformative technologies in human history were sparked by something deceptively simple: a prize.

A problem recognized. A deadline set. A reward announced for the solution.

And then—an open invitation to anyone bold enough to try.

Take the British Parliament’s Longitude Prize of 1714. Navigation at sea was perilous because sailors could not accurately determine longitude. The reward on offer was up to £20,000—an astronomical sum at the time. The solution did not come from a celebrated astronomer, but from a self-taught clockmaker, John Harrison. His marine chronometer worked—but recognition did not come easily. Payments were staggered, disputed, and delayed. Even when innovation succeeds, institutions do not always know how to respond.

A century later, war catalysed innovation. Napoleon Bonaparte, seeking to feed his armies, offered 12,000 francs for a reliable food preservation method. The result? Nicolas Appert’s pioneering work on canning. With his innovation, food for armies could be preserved for months and years, and could keep armies fed on long campaigns to distant lands. Explorers and sailors started depending on them, opening up new frontiers of discovery. Canned food gave a fillip to farmers, now that their produce could have extended lives. And brought convenience to dining. One competition, one process, many benefits!

These early contests reveal something important: prizes work best when the problem is urgent, the goal is clear, and the reward is meaningful enough to sustain effort over time.

Rainhill Trials

Fast forward to the industrial age. The directors of the Liverpool and Manchester Railway had originally intended to use stationary steam engines to pull trains along the railway using cables. However, their engineer George Stephenson strongly advocated for the use of steam locomotives instead. As the railway was approaching completion, the directors decided to hold a competition to decide whether locomotives could be used to pull the trains. The Rainhill Trials of 1829 offered a prize of £500 for the best way to haul the trains. George Stephenson’s Rocket won decisively, and its design quickly became the standard for locomotives. Here, the feedback loop between competition and adoption was almost immediate.

Then came the age of flight. The Raymond Orteig Prize promised $25,000 for a nonstop transatlantic flight. Charles Lindbergh claimed it in 1927—but only after multiple failed attempts and fatal crashes by others. The prize went to Lindbergh, but more importantly, it accelerated aviation as an industry.

By the late 20th century, competitions had evolved into global innovation platforms. The XPRIZE Foundation’s Ansari X Prize offered $10 million for private human spaceflight—and catalysed over $100 million in investment before it was eventually won. The DARPA Grand Challenges, with prizes of $1–2 million, helped lay the groundwork for self-driving cars.

And in the digital age, contests have become even more distributed. The Netflix Prize offered $1 million to improve its recommendation algorithm—successfully claimed, and now foundational to digital platforms. Competitions on Kaggle for machine learning and data science challenges are designed to solve complex, real-world problems using crowdsourced predictive modelling. They routinely offer prizes ranging from a few thousand dollars to over $1 million, with winning models often deployed in real-world systems.

Not all prizes, however, are claimed. The Google Lunar X Prize famously went unawarded when no team met the deadline. And yet, several participating teams went on to become serious space ventures. More recently, the rebooted Longitude Prize on antibiotic resistance—run by Nesta with a purse of £10 million—was eventually awarded after years of global effort.

Enter the Hackathon: The New-Age Contest

If prizes defined earlier centuries, hackathons define ours.

From college campuses to corporate offices, hackathons have become the default format for innovation challenges. India, in particular, has embraced them at scale through initiatives like the Smart India Hackathon, where winning teams typically receive ₹1–5 lakh, along with visibility and recognition.

At first glance, hackathons look like a natural continuation of the prize tradition. But look closer, and a crucial distinction emerges.

Hackathons are built for speed. Typically compressed into 24 to 72 hours, they excel at generating ideas, prototypes, and energy. They uncover talent and encourage collaboration. But they are not designed for depth.

The breakthroughs that defined earlier prize competitions were the result of years of iteration, backed by incentives large enough to justify sustained commitment. Even modern competitions on Kaggle run for months, allowing refinement and optimisation. Hackathons, by contrast, often end at the stage of a promising prototype.

This is not a weakness. It is a different role.

Hackathons are the sparking mechanisms of the tech world.

Lessons for India: Moving from Events to Ecosystems                            

India is no stranger to ingenuity—though often of the jugaad class. We could surely use the powerful lever of structured, sustained innovation contests.

1. Define Grand Challenges That Matter Locally
India’s problems—air pollution, water scarcity, affordable healthcare—require sharply defined challenges and serious prize money. Rewards must be large enough to sustain effort beyond a weekend.

2. Open Participation Beyond Credentials
Breakthroughs often come from unexpected quarters. Platforms must include informal innovators, practitioners, and non-traditional problem-solvers.

3. Build a Pipeline, Not One-Off Events
Hackathons should be the starting point, not the endpoint. Without this pipeline, ideas from initiatives like the Smart India Hackathon risk fading away.

4. Shift from Inputs to Outcomes
Prize systems reward results, not proposals—encouraging creativity and reducing bureaucratic inertia.

5. Invest in Follow-Through
Mentorship, funding, and testing environments are what convert prototypes into deployable solutions.

6. Measure Success Beyond Winners
India must move beyond a binary view of success. Even if a prize is not claimed, the ecosystem it builds can be valuable.

Because sometimes, all it takes to change the world…is not just a prize—but a prize large enough, a timeline long enough, and a system strong enough to turn ideas into impact.

–Meena                  

Pic: http://www.rainhilltrials.org/

A Just War: St. Augustine of Hippo

Pope Leo XIV’s recent visit to Algeria had a special personal significance for His Holiness. The current Pope is the first Augustinian Pope; he is a member of the Order of St. Augustine. The Pope visited the archaeological site of ancient Hippo (modern-day Annaba), where once stood the Basilica of Peace, the church where St Augustine was Bishop for 34 years. He described it as “a profoundly emotional and spiritual moment”.

Who was St Augustine, and why is he significant even today, even though he lived 16 centuries ago? 

St. Augustine of Hippo was a theologian, writer, preacher, rhetorician, and bishop. Before he became a saint, Augustine’s life saw many different phases. He was born in 354 AD in Thagaste, presently called Souk Ahras in modern-day Algeria, which was then a part of the Roman Empire. He was African, and not white, as later depicted. His mother Monica was a devout Christian, while his father was a tax collector who converted to Christianity only on his deathbed.

Augustine was a bright student with great rhetorical skills; he travelled to Carthage to study rhetoric. But in his teenage years he was irreligious and led a wild life, in spite of his mother’s prayers and counselling. In his twenties Augustine was restless, and in his quest to discover ‘the truth’ he experimented with the cult of Manichaeism, a concoction of Christian, Buddhist, astrological and pagan elements. He then became influenced by Neo-Platonism, which drew from Plato. Augustine was still lost and wandering, much to his mother’s despair.

His wandering led him to Milan, where an encounter with the bishop of Milan, Ambrose, changed his life path. The bishop, who was considered one of the greatest orators in the Roman world, influenced Augustine, at the age of 33, to convert to Christianity in AD 387. Augustine was ordained a priest in 391 and he became Bishop of Hippo (today’s Annaba, Algeria) in 395.

After his conversion, Augustine used his skills as a thinker and writer in service to the church. He moved back to North Africa and eventually became bishop of a town named Hippo. He continued not only to speak eloquently, but was also a prolific writer. His personal journey, the story of his spiritual quest, formation and conversion, is documented in The Confessions. Written in 401, this work is a model for modern autobiography as it depicts the formation of a mind and character.

St. Augustine also wrote several other noteworthy books. Perhaps the most relevant today, are his thoughts and writings on war and peace. He was one of the first people to articulate a philosophical statement of war and justice. Augustine drew upon Christian teachings as well as Greek and Roman philosophy to propose a set of principles that defined a ‘just’ versus an ‘unjust’ war. He laid the groundwork for what became known as the Just War Doctrine.

St. Augustine lived in volatile times. In a period when there was a race to expand Empires, St. Augustine questioned the value of such expansion, comparing it to a human body, arguing that a moderate stature with good health is preferable to an oversized, unhealthy body (and by extension, empire).

He viewed war with deep sorrow and profound skepticism regarding its ultimate value. Acknowledging that war is inevitable in some situations, he laid down certain criteria under which war is a permissible recourse. 

  • Just Cause: to confront “a real and certain danger” to protect innocent life.
  • Competent Authority: declared by those with responsibility for public order.
  • Comparative Justice: Are the values at stake critical enough to override the presumption against war?
  • Right Intention: War can only be conducted to satisfy the just cause.
  • Last Resort: All peaceful alternatives have already been exhausted.
  • Probability of Success: The outcome cannot be disproportionate or futile.
  • Proportionality: inflicted damage must be proportionate to the good expected.

St. Augustine also drew upon earlier Roman conventions about war. For example, that it must be declared by proper authorities and not by angry mobs. The Empire also had a policy — not always followed — of allowing conquered people to keep their own customs, religions, and local laws.

Which wars are not just? These would be wars marked by selfish intent. Wars to seize territory, wealth, or power are not just. Wars fought for personal glory or out of vindictiveness are not just.

While he agreed that a ‘just war’ could be fought to restore peace or punish injustice, he viewed it as a ‘necessary evil’ rather than a good. He maintained a fundamental belief in the futility of war for achieving lasting, meaningful human happiness.

As early as the fifth century St. Augustine was considering the moral consequences of war. For Augustine, any conflict that does not have the final establishment of a stable, just peace as its sole purpose, is useless and immoral.

In 418 A.D., he wrote: “Peace should be the object of your desire; war should be waged only as a necessity…in order that peace may be obtained. Therefore, even in waging a war, cherish the spirit of a peacemaker, that, by conquering those whom you attack, you may lead them back to the advantages of peace…As violence is used toward him who rebels and resists, so mercy is due to the vanquished or captive.”

In his book City of God, he noted that even a ‘just war’ is characterized by cruelties and miseries. He viewed the brutality of war with profound lament and often expressed a “deep hatred of war” and contempt for those who glorified military victories. The Just War doctrine emphasizes that a set of rules for military combat must be followed. This means treating non-combatants such as women, children, elderly, wounded, and prisoners of war humanely.

The Just War theory has become an integral part of Western philosophy. Even today it is reflected in international humanitarian law.

St. Augustine regarded the power-seeking know-it-alls of the day as ‘dangerous fools armed with the pretense of knowledge’. In our current turbulent period of history, we are witness to how international and humanitarian law is being flouted with arrogance and impunity by exactly the same kind of people.

It is a time to remember another valuable insight from St. Augustine: the idea that believing you are bigger or better, more self-righteous, or somehow immune from the rules that govern others — the absence of humility, in other words — gives you license to do unto others what you would never allow them to do unto you.

Just War?  Or A Just War?

–Mamata

The Lady at the Helm: Sumati Morarji

Among the many days, significant or silly, that are celebrated, this week marked a special one in India’s maritime history: April 5, observed annually as National Maritime Day.

India, a peninsular subcontinent with more than 7000 km of coastline, has a long maritime history, dating back to the Indus Valley Civilization. Since ancient times Indian sailors ventured out to sea thanks to their deep understanding of ocean patterns and the monsoon winds. This allowed them to travel safely and efficiently, opening up trade routes to distant lands. It is speculated that the English word ‘navigation’ may have its roots in the Sanskrit ‘navgati’, a combination of ‘nav’ meaning ship or sailing vessel, and ‘gati’ meaning speed or progress. There are many stories in Indian mythology about the seas and oceans, and proof of Indian maritime operations can be found in Indian literature, sculpture, painting and archaeology.

The advent of the colonial powers replaced the traditional trade and trading vessels with what became a European monopoly. Over the next few centuries British companies dominated the shipping industry. In the early 1900s a far-sighted group of Indian industrialists, led by Walchand Hirachand, and including Narottam Morarji, Kilachand Devachand, and Lallubhai Samaldas dreamed of creating India’s own mercantile fleet—a swadeshi shipping enterprise.

They formed a company called the Scindia Steam Navigation Company and purchased a steamer from the Gwalior royal family. The ship, the RMS Empress, originally purchased from the Canadian Pacific Railway, had been used as a hospital ship for wounded Indian soldiers in World War I. It was a challenge to set up and sustain a marine mercantile enterprise in the face of the long-established British companies.

RMS Empress was renamed the SS Loyalty. On 5 April 1919, the now totally swadeshi SS Loyalty made its maiden voyage from Bombay to London. It carried 700 passengers and cargo. This was significant as it marked the beginning of breaking the British monopoly on maritime trade. This event continues to be commemorated as National Maritime Day on 5 April every year.

What makes the subsequent story of the founding company, the Scindia Steam Navigation Company, even more significant is the story of a remarkable woman who steered the company to exemplary success.

Sumati Morarji was born in 1909 into an affluent, and conservative, merchant family of Bombay. Named Jamuna by her parents, she was married at the age of 13, with an extravagant wedding, to Shanti Kumar Morarjee, the only son of Narottam Morarji, an eminent industrialist and one of the co-founders of the Scindia Steam Navigation Company.

The young bride was extremely bright, and displayed a great thirst for learning. She was also keen on understanding more about her marital family’s business and its working. Her father-in-law Narottam Morarji recognised in the newly-wed teenager a sharp mind, and hidden potential.

He renamed her Sumati (a woman with superior wisdom), and invited and respected her insights into the family business. She also demonstrated her management skills when she took over the running of the household following her mother-in-law’s early demise. By the time she was 20, Sumati had proved her capabilities in all spheres, and her husband Shanti nominated her to the Managing Board of the Scindia Steam Navigation Company.

This was a time when the company was still in its infancy, with a few cargo ships running between India and Europe. With Sumati at the helm, the company’s strength and reputation grew greatly over the next few decades. As India stood on the cusp of Independence, Sumati quietly assumed complete charge of the company, leading it from strength to strength as the newly independent nation began its journey to self-reliance and progress.

Sumati was deeply influenced by Gandhiji, and despite her business commitments took an active part in the underground operations of the freedom movement. In the aftermath of Partition, she used her ships to help safely transport Sindhis from Pakistan to India. She remained close to Gandhiji, with whom she corresponded regularly.

After Independence, as Indian maritime trade was increasingly handled by Indian ships, Sumati’s insights, expertise, and experience in the field played a crucial part. She set a precedent as the first woman in the world to head the Indian National Shipowners Association, a pioneering organisation of ship owners. She received global recognition, and was elected Vice President of the World Shipping Federation in London in 1970.

Sumati Morarji managed the Scindia Steam Navigation Company for 69 years, steering it to great success, until she passed away in 1998. She contributed to many national endeavours, and was a deeply spiritual person who helped propagate Indian culture across many countries. But her primary passion was her ships, which she regarded as her daughters. No wonder, then, that she is called the Mother of Indian Shipping in every sense of the phrase.

National Maritime Day is a fit occasion to honour Sumati Morarji, a lady who quietly made waves across the oceans.

–Mamata

Rainbow Island: Hormuz

Just over a month ago, the name Hormuz did not mean much to most of the world. Today the word is making headlines across the globe. The blockade of the Strait of Hormuz is having a ripple effect far from the waters of the Persian Gulf. Sadly, its claim to fame is rooted in the fallout of a war that the world did not, and does not, need.

For centuries ships have been sailing through the waters of the Persian Gulf, carrying people and cargo, and perhaps, less visibly, culture. These waters are not just conduits; they also hold small land formations, many of which are unique in their geology and biogeography. One such is the island of Hormuz.

Hormuz Island is a part of the Hormozgan Province of Iran. Located about eight kilometres from Bandar Abbas on the coast of Iran, and 18 km from Qeshm Island in the Persian Gulf, it is strategically perched where the Persian Gulf and the Gulf of Oman meet. Hormuz Island covers an area of approximately 42 square kilometres.

The ancient Greeks called it Organa; during the Islamic period it was known as Jarun, and later it took the name Hormuz from Ormus, a significant mainland port. It is believed that around 1300 A.D. the ruler of that town and its inhabitants shifted to the island in order to evade attacks by Mongolian and Turkish troops, calling their island home New Hormuz. Today it is known simply as Hormuz.

Its strategic location at the entrance of the Persian Gulf has historically made Hormuz Island a key trading post. It was an important stop for traders on the Silk Road. In the 15th century a Russian merchant described it as “a vast emporium of all the world”.  

It was part of a flourishing kingdom during the medieval period, which attracted the attention of the European powers. In the early 16th century it was taken over by the Portuguese, who established a fort there that helped them control the strategic trade routes between Europe, India, and the Far East, and dominate the spice trade.

It remained a Portuguese colony for almost a century and a half, before being taken over in 1622 by the Safavid Empire, a powerful Iranian dynasty. It has been part of the Persian realm since then, and is today part of Iran.

Even today the island’s culture and customs reflect a blend of Arab influences from neighbouring countries, as well as past interactions with Portuguese, English, and Indian merchants. The island is home to Persians, Arabs, and the indigenous Hormuzis, each contributing to a rich cultural mosaic.

What makes Hormuz Island unique are its dazzling geological features. It is one of the world’s biggest salt domes — a mound of salt layers that has risen up through overlying layers of rock. The formation glows with brilliant shades of red, yellow and orange, a palette created by the minerals deposited in its layers of shale, clay, and volcanic rock. Geologists believe that hundreds of millions of years ago shallow seas formed thick layers of salt around the margins of the Persian Gulf. These layers gradually folded and interlayered with mineral-rich volcanic sediment in the area, leading to the formation of the brilliantly coloured soil and mountains.

The high percentage of minerals, including iron, gypsum, oligiste, apatite and quartz, has given colour to all the geographical features: ochre-coloured streams; beaches with sand ranging from white, silver and gold to crimson; the Rainbow Mountains, whose vibrant colours range from deep reds and oranges to purples and yellows; and natural salt caves that resemble vibrant works of art. The beaches are covered in crimson sand, and when the waves wash over them, the water too takes on reddish and pink hues. Because the beach’s red glow could be spotted from far out at sea, sailors once used it as a natural navigation marker in the Strait of Hormuz.

The geological formations on the island include the Valley of Statues, a surreal landscape of natural rock formations; Silence Valley, which has an eerie, completely silent landscape of salt formations; and the Cave of the Salt Goddess, formed over thousands of years by water erosion and salt crystals, whose walls shimmer with vibrant shades of white, blue, purple and pink.

In this stunning landscape, the Red Mountain stands tall. It is all the more unique because not only is its soil bright red, it is also edible!

The red soil gets its colour from haematite, an iron oxide that comes from the island’s volcanic rocks. The mineral is valuable in the production of cosmetics, paper, plastic, stainless steel, ceramics, tiles, pottery and glass, and was exported for these uses, but exports have now been reduced to prevent overexploitation. This soil is found nowhere else in the world; visitors are not permitted to carry any samples off the island, even as a souvenir.

The soil plays an equally important part in local cuisine! Known as Gelack, it is used as a spice, especially in the traditional fish curry called sooragh. A rare edible soil that is a valued spice in itself!

If Hormuz Island is renowned for its geological features, it is equally rich in biodiversity. A patch of mangroves at its northern end adds a touch of green to the vibrant palette. The island ecosystem harbours a rich variety of bird species, and the waters are home to thriving marine life.

With its kaleidoscope of natural colours, Hormuz aptly deserves the title of Rainbow Island. Sadly the rainbow is today eclipsed by the dark clouds of war.

-Mamata

Of Birds and Birdwatchers: International Bird Day — 1 April

International Bird Day, observed each year on April 1, traces back to the International Convention for the Protection of Birds, one of the earliest international efforts to formally recognise the need to conserve avian life. On this day, the focus is usually on birds themselves—their fragile habitats, their migrations, their role in holding ecosystems together and the threats they face.

But today we are looking more at the people who watch birds, and at their idiosyncrasies.

Every obsession develops its own private vocabulary. Birdwatchers—or birders, as they prefer—have taken this instinct a step further. Over time, they have shaped a dictionary so distinctive that to outsiders it can sound faintly eccentric. Words here behave differently: they slip free of their everyday meanings, acquire new ones, and quietly signal who belongs.

Consider twitching. To most people it suggests nervousness or involuntary movement. In birding, it means deliberate, often hurried travel to see a rare bird reported elsewhere. A twitcher is someone who drops everything at short notice and sets off, binoculars in hand, to chase that possibility. The term emerged in mid-twentieth-century Britain, when news of sightings spread through phones and handwritten notes, carrying with it a sense of urgency and barely contained excitement.

This habit of repurposing language runs deep among birders. A tick is not a parasite but a small victory—a species added to one’s personal list. A lifer marks a first-ever sighting, the kind that stays with you. A dip, on the other hand, captures a very specific disappointment: travelling all that way and missing the bird.

Some of the most intriguing expressions describe perception. Jizz—sometimes softened to giss—refers to the overall impression of a bird: its shape, posture, movement, the rhythm of its flight. You may not catch every marking, but you recognise it instinctively. “It had the jizz of a harrier.” There is no real substitute for this word, which perhaps explains why birders defend it with quiet determination.

Then come the social terms, edged with humour. A stringer is someone suspected of stretching the truth about sightings—their records “on a string.” A lagger arrives too late. A gripper is not an object but a bird so rare it inspires envy, and to be gripped off is to feel that envy keenly while still offering polite congratulations.

Even equipment is linguistically reshaped. Bins are binoculars. A scope is a spotting-telescope. To lock on is to get your optics trained on the bird before it disappears. These are practical words, forged in moments where seconds matter.

Place, too, carries its own vocabulary. A patch is a birder’s regular haunt, revisited across seasons and years. A stakeout involves waiting patiently at a known location. Suppression refers to an ethical choice—not publicising a rare bird’s location if attention might disturb it. To flush a bird is simply to make it fly off, usually by getting too close, and is generally frowned upon.

What stands out is how emotionally evocative this language is. It does not just describe birds; it maps the experience of pursuing them. Like any specialised language, birding slang creates community. To know the terms is to belong; to learn them is to enter gradually. Yet many of these words travel beyond their niche. Twitching now describes reactive behaviour more broadly. Jizz has been borrowed into design and art. Patch has found a life in other forms of local attachment.

Colour, in birding, acquires a precision that everyday language rarely demands. Birds are not simply brown or grey; they are rufous, buff, ochre, slate, ashy, olive, chestnut. A drab-looking bird, on closer inspection, becomes a composition of tones—warm on the flanks, cooler on the crown, a faint wash along the breast. These are not ornamental choices of words but functional ones, allowing birders to separate one species from another in seconds. To say “yellow” is often useless; to say “sulphur-yellow with a greenish wash” is to narrow the field.

Even familiar colours are subtly reworked. A “black” bird may, to a birder, show glosses of blue or green; a “white” wing might carry a hint of cream or grey that matters enormously in identification. Terms like supercilium (the eyebrow stripe), mantle (the upper back), and primaries (the outer flight feathers) turn the bird’s body into a map where colour is carefully located, not loosely described. Over time, birders learn to see in these finer gradations, and the language follows suit—less about naming colours as we know them, more about learning to see them as birds wear them.

Some colour words have travelled the other way—borrowed not to describe birds, but from them. Teal is the most familiar example, a word that once referred primarily to the small freshwater duck, the Eurasian teal, whose striking greenish-blue patch lent its name to a shade now used everywhere from fashion to design. What began as a bird became a colour, and then quietly detached itself, so that many people use “teal” today without any awareness of its avian origin.

This is not an isolated case. Duck-egg blue, robin’s egg blue, and peacock blue all carry traces of the natural world into everyday speech. The Indian peafowl, for instance, has given us a whole palette of iridescent blues and greens, while the soft tint of a robin’s egg has become shorthand for a particular pastel. In these instances, birding has quietly shaped how colour is named and imagined—proof that even those who never lift a pair of binoculars are, in some small way, speaking a language borrowed from birds.

In an age where English is increasingly standardised, birdwatching offers a reminder that language still evolves wherever people care deeply enough. These words were not coined for effect. They emerge out of necessity—to express experience, to share feeling, to laugh gently at oneself.

So this International Bird Day, stand quietly at the edge of a wetland or in your garden. Watch the birds and hopefully, you will get a tick!

–Meena

Pic: BNHS https://www.bnhs.org/nature-trails-details/

Dedicated to Serve: Dr Ida Scudder and Christian Medical College, Vellore

A young American girl, born and brought up in a missionary family in a small town in Tamil Nadu, was expected to continue the family’s tradition of service to the neediest of the people. Ida Scudder, born in 1870, the only sister to seven boys, was exposed at an early age to the poverty and deprivation of the local population through her parents’ work. But Ida was repelled by all this. She was young and pretty, and dreamt of enjoying life and eventually making a comfortable marriage. Her parents, both long-time missionaries in South India, returned to the United States for a few years with their large family when Ida was eight years old. The comfortable life in America was a huge change from the challenging missionary work in India. After a few years of school, Ida moved to the Northfield Seminary for Young Women in Massachusetts while her parents returned to Tamil Nadu.

When Ida was 20 years old she came to visit her ailing mother in Tamil Nadu. While she was there, one night three different men came seeking medical help for their wives, who were about to deliver and were in distress. They appealed to Ida to attend to them. Ida had no medical training; her father was the doctor in the family. But the conservative community would not let their women be treated by a male doctor. The next morning Ida heard that all three young women and their babies had died in the night for lack of medical attention. This was a life-changing experience for Ida. She had found her calling.

But in order to be in a position to really help women medically, Ida herself had first to undergo medical training. She returned to the United States and enrolled in the Philadelphia Women’s Medical College, and studied further at Cornell Medical College where she was among the few female students. After 10 years of rigorous study and training she returned to India where she hoped to work alongside her father. Sadly, her father died not long after her return.

But Ida was here to stay. She determined to carry on his work, now focussing on women’s health. Her vision was that women should have the same access to quality and compassionate healthcare as men, regardless of religion or ability to pay.

She began her practice from her family home in Vellore, 135 km west of Madras, by opening a small clinic for women. Ida was initially unsure how her presence and engagement would be received by the local community; but patients trusted her, and the numbers grew greatly.

A donation from an American who wanted to memorialize his late wife led to the building of the 40-bed Mary Taber Schell Memorial Hospital for women in 1902. Ida also started organizing roadside medical camps in villages around Vellore, travelling across difficult terrain to treat people and provide health education.

Given the huge need and demand for medical care for women, Ida realized that as a single person there was only so much that she could achieve. It was critical to train and educate more people in this field. In 1903 she started to train compounders, and in 1909 nurses. Her vision was to set up a world class medical college. Many scoffed at such an ambition, but Ida was tenacious and managed to raise funds to support her cause.

The Union Mission Medical School for Women was set up in Vellore in 1918. Sceptics felt that there would be no takers. But in the very first year there were 150 applications, and 18 women were selected for the first batch, which went on to secure the Medical Practitioner Diploma.

Dr Ida Scudder’s words to the first batch of graduating students reflect her professional dedication, her tenacity, as well as her missionary spirit: “You will not only be curing diseases, but will also be battling with epidemics, plagues and pestilences and preventing them. Face trials with a smile, with head erect and a calm exterior. If you are fighting for the right and for a true principle, be calm and sure and keep on until you win.​”

In 1938 the British Government announced that it would only recognize an MBBS degree, and not a diploma. This necessitated that Ida’s medical school be upgraded to a medical college. Thus was born the Christian Medical College of Vellore. The originally women-only college became co-educational in 1945. Ida was completely engaged in every aspect of the institution—teaching, medical practice, as well as administrative responsibilities including fundraising.

Even after Independence, Christian Medical College continued to draw dedicated doctors from across the country and abroad. They came not for money or glory, but inspired by the founder Ida Scudder and her single-minded dedication to the cause of service to the sick.

Over a century after Dr Ida Scudder sowed the seeds that gave form to her vision, her legacy has blossomed into a spreading banyan tree. The tiny clinic has grown into CMC Vellore—one of India’s top-ranked educational, healthcare and research institutes.

The 40-bed hospital has grown into a 3,000-bed multi-specialty healthcare system spread over six campuses. CMC cares for over twenty lakh patients, and trains a thousand doctors, nurses and other medical professionals each year. People from all walks of life, and from all parts of the country and beyond, come here for the ethical, compassionate, and quality care for which it is reputed. Ida Scudder’s vision and work have outlived her.

This month, we have been celebrating women who have broken barriers, and led the way in many different ways, in widely diverse fields. We have shared stories of women who have truly “made a difference.” Who better epitomizes this than Dr Ida Scudder! 

–Mamata

Rebel Nomad: Isabelle Eberhardt

Continuing our celebration of path-breaking women this month. Through history, women have often been denied rightful recognition for their contributions in different fields. In STEM, their significant work has been eclipsed by the attention and glory garnered by men. Though female scientists and researchers have been equally present in labs, they have tended to lose visibility even as their achievements advanced; those achievements have been ignored, minimized, or credited to men.

There is another band of women, who have had to make efforts to disguise their real persona in order to pursue their passions. This has been the story of several women who have stepped into what is traditionally considered a ‘male domain’. Among these are women explorers who have boldly ventured into dangerous terrains on perilous missions; women who broke conventional barriers in more ways than one.

One such story is that of Isabelle Eberhardt—journalist, writer, explorer-adventurer, and rebel. Today she would also be identified as a ‘feminist’.

Isabelle was born in February 1877 in Geneva, Switzerland. Her mother Nathalie was the daughter of a German and a Russian Jew. There is some uncertainty about Isabelle’s real father, but she always considered her mother’s husband Alexandre Trophimowsky as her father. Isabelle was taught by Alexandre, who was a tutor. She studied philosophy, history and geography, and also learned many languages including French, Russian, German, Italian, Latin, Greek, and classical Arabic. She loved literature and read a great deal. Alexandre had liberal views and gave her a lot of freedom to explore, and develop, her own personality.

When she was 17, Isabelle started a correspondence with a French officer in the Sahara desert, wanting to know, in detail, all about life there. This triggered in her the yen to explore for herself. In the meanwhile, drawing on this correspondence, she began to write short stories about the region under the male pseudonym Nicolas Podolinsky. These were published in a magazine. By now Isabelle was eager to see and experience the region for herself.

She met a photographer from Algiers who offered to help her move there. In May 1897 Isabelle and her mother moved to Bône in Algeria. Isabelle was 20 years old. Both mother and daughter were distressed by how the colonial Europeans treated the local Arab population. They rented a house in the non-European part of town. This was an area where women were not expected to go out alone or without a veil. So Isabelle started wearing a burnous and a turban (the dress worn by the local men). She quickly became fluent in Arabic. She and her mother also converted to Islam. Isabelle found it easy to accept Islam because she believed in fate, and Islam gave meaning to this belief. Isabelle’s unusual lifestyle caused the French colonial settlers and officials to suspect her of being a spy.

Isabelle converted her observations and experiences into fictional stories, some of which were published. Her mother’s death in the same year that they had moved to Algeria was a great blow. She returned for a while to Geneva to look after her sick father. After he died, she mortgaged the family property and returned to Africa in 1900, where she began to lead a nomadic life. She wandered restlessly in North Africa, usually alone, writing her diaries, stories and travelogues. To freely experience everything as a native would, she wore only male clothing, joined a local Sufi group, and even changed her name to Si Mahmoud Saadi. During her travels she met and fell in love with an Algerian soldier, Slimane Ehnni. This heightened the suspicion of the French authorities. Isabelle continued to court danger; she was attacked and severely wounded by a man with a sword. Ordered by the French to leave North Africa, she went back to France, where she could barely make ends meet by working as a dock worker, disguised as a man. Meanwhile she continued with her writing.

A friend introduced her to Eugene Brieux, a writer who supported Arab freedom. He tried to get her stories published, but there was no market or support for pro-Arab stories. The only ray of light was when Slimane Ehnni was transferred to a military unit near Marseilles. They did not need permission to marry in France, and the two married in 1901. The next year, her husband left the army and the couple returned to Algeria.

Back in Algeria she started working for the Al-Akhbar newspaper. Her novel Trimardeur also began to appear in instalments in the paper. She worked hard, but only when she felt like it; and spent all her money on tobacco, books, and gifts for friends. She travelled for long periods on assignments.

Isabelle continued to lead an erratic life, travelling in perilous conditions; she indulged excessively in drink and drugs. But amid all this, her writing still occupied a central part of her life. Her articles and short stories appeared in the local press, and for a while she also wrote a regular column on the customs of Bedouin tribes. In 1903 when reporting on a battle she met the French general Hubert Lyautey, and helped him communicate with the local Arabs because of her fluency in Arabic.

Isabelle’s nomadic and often promiscuous lifestyle took a severe toll on her mental and physical health. By 1904 she was so spent and weak that she was admitted to a military hospital in Aïn Sefra. Some weeks later she discharged herself, against medical advice, to meet her husband, whom she had not seen for almost a year. The very next day the town where they had rented a mud house was struck by a flash flood. Isabelle went missing. General Lyautey ordered a search; Isabelle’s body was found later, pinned under a beam of the house and surrounded by the soggy pages of her latest manuscript. She was buried in Aïn Sefra under a marble tombstone bearing her adopted Arabic name and her birth name in French. Isabelle was only 27 years old; an untimely end to a short and tumultuous life.

The General tried to collect as many of Isabelle’s unpublished writings as he could find. These were later published. Her first posthumously published book, Dans l’Ombre Chaude de l’Islam (In the Warm Shadow of Islam), appeared in 1906 to high praise, and made Isabelle famous as one of the best writers about Africa. Streets were named after her in Béchar and Algiers.

Today Isabelle is perceived as an early feminist and anti-colonialist. In her own day, she was simply a woman ahead of her time—adventurer, chronicler, gender bender, and one who lived on her own terms.

–Mamata