
It started like any other Tuesday. The finance inbox was busy, the approvals queue was longer than usual, and a familiar vendor name appeared with a polite reminder. The message referenced a real purchase order, mirrored the vendor’s tone, and attached a letter on letterhead asking us to update bank details “as advised by the auditor.” The domain looked right at a glance. Two people approved the change. Funds moved the next day. A week later, the real vendor called to ask why their invoice hadn’t been paid.
That single decision, treating an email as proof, cost the company ₹25 lakh in under seven days. I’ve spent years helping Indian SMBs harden Microsoft 365 and Google Workspace, and I’ve built and marketed API-first email security designed for exactly this kind of attack. The thing that keeps me up at night isn’t malware; it’s moments like this: busy people, credible language, and a lookalike domain that slides past routine defenses. This is the story of what actually went wrong, how we untangled it, and the playbook that stops it from happening again.
Related Read - 1 in 99 Email is Phish
How a Good Team Still Lost ₹25 Lakh
The attacker didn’t need a zero-day. They needed a realistic script and a domain that could fool a tired pair of eyes. They copied fragments of an older thread, kept the salutations consistent, and used a PDF to create a sense of formality. Nothing in the email screamed “scam.” There was no misspelling, no crude threat, no cartoonish urgency, just a grown-up tone with a neat sign-off and “please confirm once done.”
The company had controls. Payments over a threshold needed two approvals. The problem was that both approvers drew confidence from the same poisoned source: the email thread itself. Nobody picked up the phone to call a known contact saved in the CRM. Nobody noticed that the domain swapped a single character. Nobody questioned why a banking change arrived by email instead of the vendor portal.
When the vendor called a week later, everyone’s blood ran cold. The bank tried to recall the transfer, but the funds had already hopped through intermediary accounts. The team still had to pay the real vendor. Cash flow tightened. Projects slipped. Everyone worked late for a month. Most painful of all, confidence inside the company took a hit. People who were good at their jobs felt foolish, and that’s a hard feeling to shake.
The Real Lessons (The Ones That Stick)
The first lesson is simple: email is not an authoritative system of record. It’s a conversation layer, not a control layer. When money is on the line, you need a second channel that cannot be forged by a well-written message.
The second lesson is that identity signals matter, but only if they’re enforced. DMARC reports sitting in a mailbox won’t protect you. Alignment that isn’t pushed to reject won’t stop lookalikes. Banners that show up on every single external email become wallpaper; people stop seeing them. The cues have to be timely and contextual (“first-time sender,” “newly registered domain,” “banking details mentioned”), and they have to be rare enough that staff don’t tune them out.
The third lesson is cultural. Teams do what leaders normalize. If leadership makes it acceptable, even expected, to slow down, pick up the phone, and verify a banking change, the habit spreads. If leaders only celebrate speed, people cut corners on the exact days they shouldn’t.
What We Changed After the Incident
We didn’t buy five new tools. We rewired habits and tightened a few decisive controls.
First, we wrote a one-page rule that sits on every Accounts desk: any change to beneficiary details must be verified by phone using a number already in the CRM, not the number in the email signature. No exceptions. No “just this once.” If the vendor insists on email only, the payment waits.
Second, we moved DMARC from report-only to enforcement. That meant fixing alignment properly (SPF where it made sense, DKIM where it mattered) and then pushing to reject once the legitimate senders were clean. It took a few weeks of coordination with marketing and sales systems, but the payoff is permanent.
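For reference, an enforced DMARC policy is just a DNS TXT record on your sending domain. The snippet below is illustrative: yourdomain.in and the reporting mailbox are placeholders, and the strict alignment flags (adkim/aspf) are optional; many teams keep the relaxed defaults.

```
_dmarc.yourdomain.in.   IN   TXT   "v=DMARC1; p=reject; rua=mailto:dmarc-reports@yourdomain.in; adkim=s; aspf=s; pct=100"
```

Most teams step up through p=quarantine (and a lower pct) while they clean up legitimate senders, which is exactly the few weeks of coordination described above, and only then commit to p=reject.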
Third, we gave people better signals at the moment of risk. If an email mentions banking changes, the banner doesn’t scream; it nudges: “This message mentions beneficiary updates. Verify by phone using a saved contact.” If a domain is a day old or a character off a known vendor, the email is quarantined for a human to check, not dumped into a spam pile no one reads.
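To make that concrete, here is a minimal sketch of the kind of check that drives those nudges, assuming you can already pull the sender domain and body text from your mail platform. The vendor domains, keyword list, and function names are illustrative, not a specific product’s API; a real deployment would also consult domain age via WHOIS/RDAP.

```python
# Illustrative risk-signal check for an inbound email (not a full filter).
# Assumes sender_domain and body_text are extracted upstream by your mail platform.

KNOWN_VENDOR_DOMAINS = {"acmesupplies.in", "metrofasteners.co.in"}  # from your vendor master file
BANKING_KEYWORDS = ("beneficiary", "bank details", "account number", "ifsc", "remit to")

def edit_distance(a: str, b: str) -> int:
    """Plain Levenshtein distance, enough to catch one-character lookalikes."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def risk_signals(sender_domain: str, body_text: str) -> list[str]:
    signals = []
    body = body_text.lower()
    if any(k in body for k in BANKING_KEYWORDS):
        signals.append("mentions beneficiary/banking changes")
    if sender_domain not in KNOWN_VENDOR_DOMAINS:
        for known in KNOWN_VENDOR_DOMAINS:
            if edit_distance(sender_domain, known) == 1:
                signals.append(f"lookalike of known vendor domain {known}")
    # Domain age ("registered yesterday") needs a WHOIS/RDAP lookup, omitted here.
    return signals

print(risk_signals("acmesupp1ies.in", "Please update the beneficiary account number."))
# Both signals fire here: show the verification banner or route to quarantine for review.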
Finally, we ran a tabletop exercise with Finance. No scare talk, just a realistic walk-through of the exact scenario that burned us. We practiced what to say to a vendor, how to log the verification call, how to escalate a suspicious request, and how to stop a transfer in flight. After one practice, confidence went up and response time dropped.
A Simple, Durable Way to Run Finance Without Fear
Think of this as a service quality issue, not a tech project. You want approvals that stand up to a bad day. Approvals must be independent. If Approver A reads the email, Approver B cannot rely on that same email as proof. B needs a CRM-logged call or a portal record.
Bank changes need a cooling-off period. New beneficiaries or updated accounts don’t move money instantly. A short delay with an alert to Finance leadership beats a week of panic later.
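If your payments workflow is scripted or semi-automated, the cooling-off rule can be a few lines of logic. This is a sketch under assumptions: the field names (added_on, phone_verified) and the two-day window are placeholders for whatever your own records and risk appetite look like.

```python
# Illustrative cooling-off check before releasing a payment.
# Field names and the window length are assumptions about your own payments data.
from datetime import date, timedelta

COOLING_OFF = timedelta(days=2)  # pick a window your cash flow can live with

def can_pay(beneficiary: dict, today: date | None = None) -> tuple[bool, str]:
    today = today or date.today()
    if not beneficiary.get("phone_verified"):
        return False, "Hold: bank details not yet verified by phone against the CRM contact."
    if today - beneficiary["added_on"] < COOLING_OFF:
        return False, "Hold: beneficiary added recently; alert Finance leadership and wait."
    return True, "OK to release."

print(can_pay({"added_on": date.today(), "phone_verified": True}))
```

The point isn’t the code; it’s that the delay is enforced by the system, not by whoever happens to be approving that day.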
Your vendor master file is gold. Keep verified contacts, canonical domains, and the last verified date. It turns “we think we know them” into “we just called them.”
You’ll notice none of this requires heroics. It requires a page of clear text, ten minutes of training, and leadership willing to back a slower, safer way of working.
Technology That Helps Without Getting in the Way
When you harden Microsoft 365 or Google Workspace, aim for quiet, high-signal controls. Enforce modern authentication and kill legacy protocols you don’t use. Alert on new inbox rules and new OAuth app grants. Reserve quarantine for the riskiest patterns: new domains asking for money, display-name mismatches on known vendors, and first-time senders discussing invoices or settlement.
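As one example of the “alert on new inbox rules” idea in Microsoft 365, you can poll the Graph messageRules endpoint and diff the result against a saved snapshot. This is a sketch, not a finished monitor: it assumes an app registration with the appropriate mailbox-read permission, and get_token() and alert() are hypothetical helpers you would supply (for instance via MSAL and your ticketing system).

```python
# Sketch: poll Microsoft Graph for inbox rules and flag ones we haven't seen before.
# Assumes a valid access token from an app registration; error handling is omitted.
import json, pathlib, requests

GRAPH = "https://graph.microsoft.com/v1.0"
SNAPSHOT = pathlib.Path("known_rules.json")

def new_inbox_rules(user: str, token: str) -> list[dict]:
    resp = requests.get(
        f"{GRAPH}/users/{user}/mailFolders/inbox/messageRules",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    rules = resp.json().get("value", [])
    known = set(json.loads(SNAPSHOT.read_text())) if SNAPSHOT.exists() else set()
    fresh = [r for r in rules if r["id"] not in known]
    SNAPSHOT.write_text(json.dumps([r["id"] for r in rules]))
    return fresh  # anything here is a rule someone (or an attacker) just created

# for rule in new_inbox_rules("finance@yourdomain.in", get_token()):
#     alert(f"New inbox rule on finance mailbox: {rule.get('displayName')}")
```

A forwarding or delete rule that appears on a finance mailbox without a ticket behind it is exactly the kind of quiet signal worth waking someone up for.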
Most important: don’t drown your team in alerts. Teach the system to say less, but say it at the right moment.
If You’re Reading This After a Close Call
Call your bank now and ask for a hold or recall. Preserve the evidence (full headers, approvals, attachments, and logs) before anyone starts forwarding emails or editing notes. Tell the real vendor what happened and agree on a plan so the relationship survives. If you carry cyber insurance, notify them early. Then sit with your team and write the one-page rule. Print it. Tape it to the desk. Live by it.
Why Indian SMBs Keep Getting Hit and How to Flip the Script
We move fast. We prize responsiveness. We do more with less. Those are strengths, but they set the stage for Business Email Compromise because attackers don’t need to break anything; they only need to borrow our rhythm. The antidote isn’t paranoia. It’s a tiny dose of friction at the right moments: a phone call before a bank change, a day’s delay before a new beneficiary is used, a banner that nudges instead of nags.
In my own work building and deploying API-driven email security for Microsoft 365 and Gmail, the biggest improvements came from these small, human-centred changes. Once teams saw that the “extra step” took two minutes and saved twenty-five lakh, nobody argued for the old way again.
What to Do This Week
You don’t need a grand program. You need three moves:
- Write the bank-details rule and start using it today.
- Set DMARC to enforcement after aligning the legitimate senders.
- Run one practice scenario with Finance so the first time isn’t during a crisis.

If you do only that, you’ve already broken the attacker’s playbook.


