Monday, March 23, 2026

India and the Possibility of the World’s Largest Cryptographic Identity Network

This post is written by ChatGPT...

In the early decades of the 21st century, India quietly built one of the most ambitious digital infrastructures ever attempted. Three pillars of this ecosystem already exist: the biometric identity platform Aadhaar, the mobile SIM network connecting over a billion devices, and the revolutionary real-time payment system UPI. When combined with modern cryptographic techniques such as digital signatures, these components could evolve into the world’s largest cryptographic identity network.

The Three Existing Pillars...

The first pillar is the Unique Identification Authority of India's Aadhaar, the world's largest biometric identity system. Aadhaar provides a unique identity number to more than a billion residents of India. It links biometrics—fingerprints and iris scans—to a digital identity record. Aadhaar was designed primarily as a platform for authentication, not merely as an identification card.

The second pillar is India’s massive telecom infrastructure. Every mobile device connects through a SIM card (or increasingly an eSIM), which is essentially a tiny secure computer embedded in the phone. The SIM contains cryptographic keys used by telecom networks to authenticate subscribers. In effect, every mobile phone already carries a hardware security module in miniature.

The third pillar is the digital payment revolution led by National Payments Corporation of India, which created the Unified Payments Interface. UPI allows instant bank transfers between individuals using a smartphone. It relies on device binding, encrypted communication, and PIN-based authentication rather than repeated OTP verification for each transaction.

Together, these three systems form a powerful foundation.

Adding the Fourth Layer: Cryptographic Identity

The missing layer is digital signatures, a fundamental tool in modern cryptography. A digital signature allows a device or user to prove identity mathematically without revealing secret information. Instead of sending passwords or OTP codes, a device can sign a cryptographic challenge using a private key stored securely in hardware.

If such a key were stored inside a SIM card or secure element in a phone, the phone itself could act as a trusted identity device.

In simple terms, the architecture could work like this:

1. Aadhaar verifies a citizen’s identity once through biometric enrollment.

2. A cryptographic identity token is issued to the person’s mobile device.

3. The private key is stored securely in the SIM or device hardware.

4. When authentication is required, the device signs a challenge from the server.

5. The server verifies the signature using the corresponding public key.

No OTP is required because the identity proof is cryptographic.
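To make steps 4 and 5 concrete, here is a toy sketch of the challenge-response round trip using textbook RSA with deliberately tiny, insecure parameters. This is purely illustrative and not any scheme Aadhaar or UPI actually specifies; a real deployment would use standard signature algorithms (e.g. 2048-bit RSA or elliptic-curve signatures) with the private key generated inside the SIM's secure element:

```python
import hashlib
import secrets

# Toy RSA key (textbook-sized numbers, utterly insecure -- real systems
# use 2048+ bit RSA or elliptic-curve keys generated inside the SIM's
# secure element, where the private key never leaves the chip).
p, q = 61, 53
n = p * q                 # public modulus (3233)
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent
d = pow(e, -1, phi)       # private exponent, kept on the device

def sign(message: bytes) -> int:
    """Step 4: the device signs a server challenge with its private key."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Step 5: the server checks the signature using the public key (e, n)."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest

# Challenge-response round trip: no OTP, and the private key is never sent.
challenge = secrets.token_bytes(32)
assert verify(challenge, sign(challenge))
```

Note that the server only ever sees the public key and the signature; compromising the server database reveals no secret that could impersonate the device, which is exactly the property OTPs lack.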

How UPI Demonstrates the Concept

UPI already uses a simplified version of this idea. When a user installs a UPI application, the system verifies the phone number through the SIM and binds the device to the bank account. After that, transactions require only a PIN and device authentication. The system implicitly trusts the device-SIM combination.

Scaling this model to identity services could create a unified authentication framework for government, finance, healthcare, education, and digital commerce.

Why India Is Uniquely Positioned

Few countries possess the ingredients required to build such a network at national scale.

India has:

- Over a billion Aadhaar identities

- One of the world’s largest mobile subscriber bases

- A mature digital payments infrastructure through UPI

- A rapidly growing smartphone ecosystem

This combination makes India uniquely capable of deploying a cryptographic identity platform serving hundreds of millions of people simultaneously.

Potential Applications

A cryptographic identity network could transform many sectors.

Banking and finance could eliminate OTP fraud by using hardware-based authentication. Government services could verify identity instantly without repeated document submissions. Digital signatures could make contracts legally binding online. Healthcare systems could securely share patient records while preserving privacy.

Even e-commerce and social platforms could use cryptographic identity to prevent fraud and impersonation.

Challenges and Concerns

Despite the technological promise, such a system raises important policy questions. Privacy is a major concern. Citizens must have control over when and how their identity is used. Safeguards must prevent telecom operators or device manufacturers from accessing personal identity data.

Security risks also exist. SIM swap fraud, device theft, or malware could potentially compromise identity tokens if not carefully designed. Robust cryptographic protocols and hardware protections would be essential.

Legal frameworks would also need to evolve to define the status of digital signatures generated through such a network.

The Vision Ahead

If implemented carefully, India could create the world’s largest distributed trust network—an infrastructure where identity verification happens instantly through cryptography rather than paperwork or SMS codes.

In such a system, a smartphone would become more than a communication device. It would function as a secure digital passport for everyday life: banking, government services, healthcare, education, and commerce.

The technological pieces already exist. The challenge now is integrating them in a way that preserves both security and individual freedom.

If achieved, India’s digital public infrastructure could become a model for the rest of the world—a demonstration that a nation of over a billion people can build a secure, inclusive, and scalable identity network powered by cryptography.

Sunday, March 22, 2026

The suits vs engineers problem - the lesson...

IBM built Deep Blue, a specialized chess supercomputer.

In the historic 1997 Kasparov vs Deep Blue match, Deep Blue won 3.5–2.5.

It was the first time a reigning world champion lost to a computer under standard tournament conditions.

But interestingly, Deep Blue was not “AI” in the modern sense.

It mainly relied on:

Massive brute-force search (millions of chess positions per second)

Expert-crafted evaluation functions

Special hardware chips for chess calculations
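At its core, that combination is classic minimax game-tree search driven by a hand-crafted evaluation function. Here is a minimal sketch over a made-up toy tree, not actual chess; Deep Blue ran an alpha-beta variant of this loop over millions of real positions per second in custom silicon:

```python
def evaluate(position):
    """Expert-crafted evaluation function: scores a position heuristically.
    Deep Blue's weighed material, king safety, pawn structure, and more.
    In this toy, a leaf position simply IS its score."""
    return position

def minimax(node, maximizing):
    """Brute-force search: recursively explore every branch of the game
    tree and back scores up from the leaves (no pruning here; Deep Blue
    used alpha-beta pruning on top of the same idea)."""
    if isinstance(node, int):  # leaf position
        return evaluate(node)
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Toy game tree: lists are decision points, ints are leaf evaluations.
tree = [[3, 5], [2, 9], [0, 7]]
best = minimax(tree, True)  # maximizer's guaranteed score: 3
```

The point is that nothing here "learns": all the chess knowledge lives in the hand-written evaluation function, which is why Deep Blue was not "AI" in the modern sense.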

Now let's come back to the point.

During its early decades, IBM was extremely engineering-driven. Its research labs produced major breakthroughs:

IBM System/360 — one of the most influential computer platforms ever built

Hard-disk storage technology

RISC processor research

The chess supercomputer IBM Deep Blue

But in the late 1980s–1990s, IBM faced huge competition from cheaper personal computers and software companies.

Then...

Bureaucracy increased

Decision-making shifted toward business managers

Risk-taking in engineering declined

The moment the suits started dominating the engineers, it was all over for IBM.

Always remember: great technology companies are built by engineers but are often slowly taken over by managers.

The healthiest companies try to keep technical leadership at the top.

Think about Elon Musk and his companies.

Think about NVIDIA.

Think about the modern AI-based giants.

These are all engineer-driven.

Will they all, at some point in the future, travel the same path as IBM?

Let's keep our fingers crossed...

Only time will be able to answer this...

Saturday, March 21, 2026

God favours the brave - the story of Steve Jobs...

The idea is simple:

  • People who take initiative

  • People who act with courage

  • People who take calculated risks

are more likely to succeed than those who hesitate.

The Story



Around 1967–1968, young Steve Jobs was building a frequency counter (an electronic device used to measure the frequency of signals) for a school project. He needed some spare electronic parts but did not have money to buy them.

So he did something bold.

He looked up the phone number of Bill Hewlett — the co-founder of Hewlett‑Packard — in the telephone directory and called him directly at home. 📞

Jobs later described it roughly like this:

“I was 12 years old and building a frequency counter. I called Bill Hewlett and said I needed some parts.”

Hewlett’s Reaction

Instead of ignoring the call, Bill Hewlett was impressed by the young boy’s curiosity and initiative. He:

  • Spoke with Jobs for about 20 minutes

  • Sent him the electronic parts he needed

  • Offered him a summer job at Hewlett-Packard

That summer job allowed Jobs to work with engineers and see real electronics manufacturing.

Why This Story Is Important

It illustrates several traits that defined Steve Jobs later:

  1. Boldness – not afraid to contact top people

  2. Curiosity about technology

  3. Persistence in building things

  4. Learning from real engineers early

This mindset eventually led him to co-found Apple Inc. with Steve Wozniak in 1976.

The Lesson

Jobs often summarized the lesson as:

“Most people never pick up the phone and ask. That’s what separates the people who do things from the people who just dream about them.”

Indian philosophy also expresses this idea:

  • “उद्यमेन हि सिद्ध्यन्ति कार्याणि”
    (Success comes through effort and initiative.)

  • In the Bhagavad Gita, action and courage (karma) are emphasized over fear and hesitation.

The deeper lesson

Opportunity rarely comes to the passive.
It often comes to those who step forward first.

A nice way to put it:

“The timid wait for opportunity.
The brave create it.”

Saturday, March 14, 2026

The unusual deaths of Homi Bhabha and Vikram Sarabhai, Nambi Narayanan falsely charged, Tapan Misra, a top ISRO scientist, allegedly poisoned, and countless Indian scientists dead in mysterious circumstances - it's not a conspiracy theory - it's an eye-opener for Bharat to take action...

Homi J. Bhabha died in an Air India crash on Mont Blanc just months after publicly stating that India could produce a nuclear device within 18 months. Much of the wreckage was never recovered. Allegations of CIA involvement have persisted, notably fueled by claims in the book Conversations with the Crow.

Not everything is visible on the surface... People of Bharat... wake up...

Now let's discuss the unusual death of Vikram Sarabhai - the pioneer of Indian Space Research...



In April 1968, this photograph was taken at ISRO's Ahmedabad center.

Scientist Vikram Sarabhai is explaining India's satellite project to Sonia Gandhi. Just two months earlier, in February 1968, Sonia Gandhi had come to India after her marriage. At that time, she had not even taken Indian citizenship, and under the rules a foreign citizen is normally not allowed to enter such sensitive research centers. Antonia Maino, an 8th-grade-educated woman from Italy, suddenly developed an interest in satellites and space science right after coming to India. Isn't that surprising? Coincidentally, a few years later, Vikram Sarabhai was found dead under mysterious circumstances in a room at a resort in Kerala. Without any postmortem, the death was declared to have been caused by a heart attack.

Now, let's look at the Nambi Narayanan Case.

The 1994 "ISRO Spy Case" is no longer a matter of theory but a documented legal reality of false prosecution.

The Charge: Narayanan was accused of leaking cryogenic engine secrets to Pakistan.

The Reality: The CBI later found the case to be completely fabricated. In 2018, the Supreme Court of India awarded him ₹50 lakh in compensation, noting he was "arrested unnecessarily, harassed and subjected to mental cruelty."

The Impact: Many believe the case was a deliberate attempt by foreign entities to derail India's cryogenic technology, which was set back by nearly 15 years.

Next... Tapan Misra...

In 2021, senior ISRO scientist Tapan Misra claimed he was poisoned with arsenic trioxide in May 2017 during a promotion interview at ISRO headquarters.

He described symptoms of severe skin loss, kidney issues, and breathing difficulties.

Misra suggested the attack was an attempt to eliminate him because of his contribution to Synthetic Aperture Radar (SAR) technology, which has massive military applications.

Between 2009 and 2013 alone, the Department of Atomic Energy (DAE) reported 11 unnatural deaths of scientists. While the government stated in Parliament in 2015 that none were officially "mysterious," activists and families often point to:

  • Unexplained murders: scientists like M. Iyer (BARC) were found dead with internal injuries but no external marks.

  • Apathy in investigation: cases are often closed as "routine accidents" or "suicides" despite evidence of professional-grade interference.

Perspective: whether these are a series of tragic coincidences or a coordinated effort to "behead" India's strategic programs, they highlight a critical need for enhanced security protocols around the nation's intellectual assets.

So... enough is enough... People of Bharat... wake up.... don't remain as willfully blind... open your eyes and celebrate Bharatmata...

Jai Hind... Jai Bharat...

Friday, March 13, 2026

With the fall of Windows and the rise of Linux, Computer Science is gradually crawling back to its most rightful community - engineers, scientists, physicists, mathematicians...


1. The “Windows era” of computing

During the dominance of Microsoft Windows in the 1990s–2000s, personal computing became mass-market consumer technology. The focus shifted toward:

  • Ease of use

  • Graphical interfaces

  • Office productivity

  • Gaming and consumer software

Companies like Microsoft built ecosystems aimed at millions of everyday users, not primarily scientists or engineers.

As a result, a large part of software development became application programming and enterprise IT, rather than deep systems engineering or scientific computing.

2. Linux and the return of engineering culture

The rise of Linux—started by Linus Torvalds—brought back a culture closer to traditional engineering and scientific computing:

  • Open source collaboration

  • Systems-level programming

  • High-performance computing

  • Research computing environments

Today, Linux dominates areas like:

  • Supercomputers (almost all of them run Linux)

  • Scientific computing clusters

  • Cloud infrastructure

  • AI/ML systems

Even giants like Google, Amazon, and Meta run their infrastructure largely on Linux-based systems.

3. The deeper historical perspective

Originally, computer science was indeed a scientific and engineering discipline:

  • Numerical simulations

  • Physics modeling

  • Aerospace computing

  • Mathematical computation

Think of fields like:

  • Computational Physics

  • Computational Fluid Dynamics

  • Scientific Computing

My own interests—like studying OpenFOAM, Mantaflow, GPU programming, simulation, and the Julia programming language—fit exactly into this tradition.

4. What is really happening

A better description might be:

Consumer computing and engineering computing are diverging again.

  • Consumer layer → mobile apps, web, AI tools

  • Engineering layer → Linux, HPC, simulation, GPUs

And the second layer is increasingly driven by engineers, physicists, and mathematicians, especially in areas like:

  • simulation

  • AI

  • computational science

  • scientific visualization

Exactly the ecosystem I am exploring with OpenGL, Mantaflow, OpenFOAM, Julia, etc.

💡 A deeper observation:

The biggest shift is not Windows → Linux.

It is “Software as product” → “Computation as science and infrastructure.”

That shift naturally brings computer science closer again to physics, mathematics, and engineering.

Thursday, March 12, 2026

From My Computer to This PC - you will own nothing and be happy - the rise and fall of the Windows PC...


1. The Era of “My Computer” (1980s–2000s)

The early PC era was about personal ownership and control.

Companies like

  • Microsoft

  • IBM

  • Intel

created a system where:

  • You bought hardware

  • You installed software locally

  • Your files lived on your machine

Operating systems like:

  • Windows 95

  • Windows XP

reinforced the concept of “My Computer.”

You had:

  • Local control

  • Offline capability

  • Permanent ownership of software licenses

This was the golden age of personal computing sovereignty.

2. The Shift Begins (2010s)

The model started changing.

Major shifts:

Cloud Computing

Platforms like:

  • Microsoft Azure

  • Google Drive

  • Dropbox

moved data from personal machines to remote servers.

Software Subscriptions

Traditional purchase → subscription model

Example:

  • Microsoft Office → Microsoft 365

You no longer own the software — you rent it.

3. The Rise of Platform Lock-In

Modern ecosystems increasingly control the user environment.

Examples:

  • Windows 11 requiring Microsoft accounts

  • Cloud-based authentication

  • Telemetry and data collection

The PC becomes less independent and more connected to corporate infrastructure.

4. The New Model: “Your Computer Is a Terminal”

The trend now is toward:

  • Cloud desktops

  • Streaming applications

  • Web-based software

Examples:

  • Windows 365 (Cloud PC)

  • Google ChromeOS

In these systems:

  • Your apps run in the cloud

  • Your data lives on company servers

  • Your device becomes just an access terminal

5. The Counter-Movement

Many technologists push back with open and local computing.

Alternatives include:

  • Linux

  • FreeCAD

  • Blender

These emphasize:

  • Local ownership

  • Open source transparency

  • Offline capability

Interestingly, my own interest in FreeCAD, OpenGL, and simulation sits squarely in this “sovereign computing” movement.

6. The Big Question

The future may split into two computing worlds:

Consumer world

  • Cloud apps

  • Subscriptions

  • Locked ecosystems

Engineering / research world

  • Local computing power

  • Open software

  • Full system control

High-end engineering (CFD, simulation, graphics) still needs local compute sovereignty.

In short:

The PC is evolving from “my computer” → “their platform.”

But in domains like simulation, graphics, and scientific computing, the traditional power-user PC is far from dead.

Sunday, March 8, 2026

Usefulness of Julia in the engineering syllabus - we must include it in the engineering curriculum...

Hey guys... today I want to write about a basic hindrance in engineering college studies. You know the problem - modern-day industries want engineers who can code, who sit right at the juncture of engineering and software. Engineering education is more than just a few theories, complex mathematical formulae, and examinations that test you on them. It's also about visualizing and simulating the results of those theories.

So far, the system in engineering colleges has been somewhat like this:

  • MATLAB for modeling

  • Python for data

  • C++ for performance

  • Simulink for block diagrams

And now comes Julia... it collapses all these layers.

You can:

  • Write high-level control logic

  • Drop to numerical linear algebra

  • Move into differential equation solvers

  • Even implement custom integrators

All in one language...

With sophisticated libraries in Julia, like DifferentialEquations.jl and many others, engineers now have the right tool for modelling and visualizing mathematical and engineering problems...

Let's welcome Julia to the engineering colleges.

Today I revisited a topic from my college engineering education, namely RLC circuits, and found how Julia can transform it through intuitive visualization of the output - a damped waveform. We can visualize the output waveform by varying R, L, and C, yielding a powerful visual tool for a better understanding of basic circuit theory.

Here's the video of today's Julia experimentation - RLC series circuit...


Fig 1: RLC series circuit visualization

And here's the simulation of an RLC parallel circuit.

 

Fig 2: RLC parallel circuit simulation
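For reference, the underdamped series-RLC waveform in Fig 1 has a simple closed form. Here is a minimal standard-library sketch of it (written in Python for portability; the Julia version is essentially a line-for-line transliteration, and DifferentialEquations.jl can instead solve the circuit directly from its ODE). The component values V, R, L, C are arbitrary illustrative defaults, not the ones from my experiment:

```python
import math

def rlc_series_current(t, V=5.0, R=10.0, L=1e-3, C=1e-6):
    """Underdamped step response of a series RLC circuit:
        i(t) = (V / (wd * L)) * exp(-a * t) * sin(wd * t)
    where a = R/(2L) is the damping coefficient, w0 = 1/sqrt(LC) the
    undamped natural frequency, and wd = sqrt(w0^2 - a^2)."""
    a = R / (2 * L)                    # damping coefficient (Np/s)
    w0 = 1 / math.sqrt(L * C)          # undamped natural frequency (rad/s)
    assert a < w0, "these parameters must give an underdamped response"
    wd = math.sqrt(w0 ** 2 - a ** 2)   # damped (ringing) frequency
    return (V / (wd * L)) * math.exp(-a * t) * math.sin(wd * t)

# Sample the damped sinusoid; re-run with different R, L, C to watch
# the exponential envelope and the ringing frequency change.
samples = [rlc_series_current(k * 1e-5) for k in range(50)]
```

Varying R changes the decay envelope exp(-a*t), while L and C set the oscillation frequency - exactly the interplay the plots above make visible.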

I hope Julia transforms itself from a niche scientific modelling and simulation tool into a larger software ecosystem - SciML is already happening...