Most restaurant groups think a cyber attack happens to other people. It happens to airlines and hospitals and the sort of household-name retailers that end up on the front page of the Financial Times. It does not happen to a regional bistro chain, a single-site fine dining room in Mayfair, or a 14-cover neighbourhood Italian. Until it does. I have sat across the table from operators in all three of those categories in the last eighteen months, and the one thing they had in common was that none of them thought they were a target until the morning their tills stopped working.

This piece is not designed to scare anyone. It is designed to make the abstract concrete. If you have only ever read about cyber incidents in industry press, the language tends to be vague - “a sophisticated threat actor”, “an advanced persistent threat”, “encrypted systems”. What I want to do here is walk you through what one of these incidents actually looks like from the inside, hour by hour, so that you can make better decisions about what to invest in before it happens to you.

Day minus 21: the email nobody noticed

Almost every hospitality breach I have been involved in started weeks before anybody knew about it. In our worked example, it begins with a perfectly ordinary-looking email arriving in the inbox of an accounts assistant in head office - the kind of message that proper email security filtering would have flagged. The sender appears to be a known supplier - a linen company, perhaps, or a wine importer - and the message references a real invoice number from a real recent delivery. The attacker has either compromised the supplier already or has been quietly reading their email for a fortnight. The link in the email leads to a Microsoft 365 login page that looks identical to the real thing. The assistant types her credentials. Nothing visibly happens. She moves on with her day.

What has actually happened is that her username, password, and session token have been captured. Within minutes, the attacker logs in as her from a residential IP address in the UK that does not trip any obvious alarms. They do not do anything dramatic. They read her mail. They look at her calendar. They quietly add a forwarding rule that copies anything mentioning “invoice”, “BACS”, or “bank details” to an external address. Then they wait.
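If your IT team wants something concrete to look for, this particular tell is easy to audit. The sketch below queries the Microsoft Graph API for inbox rules that forward or redirect mail outside your own domain. It assumes a Microsoft 365 tenant, an access token with mailbox-settings read permission already in hand, and a placeholder internal domain - treat it as a starting point for your IT partner, not a finished tool.

```python
"""Flag inbox rules that forward or redirect mail outside the organisation.

A minimal sketch against the Microsoft Graph API. Assumes you already hold
an access token with permission to read mailbox settings; acquiring the
token and enumerating mailboxes are left out for brevity. The domain name
below is a hypothetical placeholder.
"""
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
INTERNAL_DOMAIN = "example-restaurant-group.co.uk"  # hypothetical placeholder


def external_forwarding_rules(user: str, token: str) -> list[dict]:
    """Return this mailbox's inbox rules that send mail to external addresses."""
    resp = requests.get(
        f"{GRAPH}/users/{user}/mailFolders/inbox/messageRules",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()

    suspicious = []
    for rule in resp.json().get("value", []):
        actions = rule.get("actions") or {}
        # Collect every recipient the rule forwards or redirects to.
        recipients = (
            actions.get("forwardTo", [])
            + actions.get("redirectTo", [])
            + actions.get("forwardAsAttachmentTo", [])
        )
        for recipient in recipients:
            address = recipient.get("emailAddress", {}).get("address", "")
            if address and not address.lower().endswith("@" + INTERNAL_DOMAIN):
                suspicious.append(
                    {"rule": rule.get("displayName"), "forwards_to": address}
                )
    return suspicious
```

Run on a schedule across every mailbox, a single alert from something like this would have caught our accounts assistant's compromise a fortnight before detonation.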

Day minus 14: lateral movement

A week or two later, the attacker has read enough internal correspondence to understand the shape of the business. They know who the finance director is, which sites are which, which POS vendor you use, and what your maintenance windows look like. They use the assistant’s account to send a message to IT requesting a password reset for a shared service account, citing a plausible reason. Or they exploit the fact that the assistant has been granted broader SharePoint access than she really needs and find a spreadsheet of saved credentials. Or they spot that one of your sites still has a Remote Desktop port open to the internet and try the credentials there.

In a flat, unsegmented network - which is still depressingly common in hospitality - once they are inside one machine, they can usually reach every other machine. POS terminals, back-office PCs, the CCTV recorder, the kitchen display screens, the manager's laptop. Our managed network practice exists precisely because this lateral movement is where small incidents become catastrophic ones, and segmentation is the single most effective architectural control against it.
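If you want a rough sense of whether your own estate is segmented, a crude reachability check is a reasonable first test. The sketch below simply asks, from wherever it is run, whether it can open a connection to hosts that should live in other segments. The addresses and ports are hypothetical placeholders, and a clean result is necessary but nowhere near sufficient - this is a smoke test, not a penetration test.

```python
"""Crude network segmentation smoke test.

Run from a segment that should NOT be able to reach the targets (for
example, a laptop on the guest WiFi) and report which sensitive hosts
answer. Any "REACHABLE" line is a segmentation failure worth chasing.
The hosts and ports below are hypothetical placeholders.
"""
import socket

# Hosts that should be invisible from the segment you run this on.
TARGETS = [
    ("10.10.20.5", 3389),   # back-office PC, Remote Desktop
    ("10.10.30.10", 443),   # POS back-of-house server
    ("10.10.40.2", 554),    # CCTV recorder, RTSP stream
]


def tcp_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    for host, port in TARGETS:
        state = "REACHABLE (bad)" if tcp_reachable(host, port) else "blocked (good)"
        print(f"{host}:{port} -> {state}")
```

Run it from the guest WiFi. If the POS server or the CCTV recorder answers, you have found the path an attacker would take.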

Hour zero: detonation

On a Friday evening - attackers love Friday evenings, because your IT support is thinner and your revenue exposure is higher - the payload fires. In one common pattern, ransomware encrypts every file it can reach across the estate. Tills go down. The kitchen display screens go black. The reservations system stops responding. Card terminals, if they are integrated with the POS, stop taking payments. A note appears on screens demanding payment in cryptocurrency.

In another pattern, the attacker is quieter. They have spent weeks siphoning card data from a vulnerable POS integration or harvesting guest records from your CRM, and there is no dramatic moment at all. You only find out months later when your acquirer rings to say a common point of purchase analysis has flagged your sites as the source of fraud on thousands of cards.

Either way, the clock has now started. And how the next sixty minutes go will determine how much of this you survive intact.

The first hour: what to do, and what not to do

The instinct, when tills stop working on a Friday night, is to start rebooting things. Do not. The single most damaging thing an untrained team can do in the opening minutes of an incident is to power machines off, wipe them, or “try a fresh install to get trading again”. You destroy the forensic evidence that you and your insurer will need, and in many cases you accelerate the spread because the malware is designed to re-trigger on reboot.

What you should do instead is isolate, not switch off. Disconnect affected machines from the network - pull the ethernet cable, disable the WiFi - but leave them powered on. Disconnect the affected sites from the wider corporate network if you can. Call your incident response provider before you call anybody else, because the decisions made in the first hour shape every decision that follows. If you have cyber insurance, your policy will almost certainly require you to call their nominated responder, and calling anyone else first can void the cover.

Then, and only then, start your communications: ops director, finance director, CEO, DPO. Do not post anything publicly. Do not email staff at large with details. Assume the attacker is reading your mail, because they very probably are.

The 72-hour clock

Under UK GDPR, if personal data has been compromised in a way that poses a risk to individuals, you have 72 hours from becoming aware of the breach to notify the Information Commissioner’s Office. That clock does not pause for the weekend. It does not pause while you decide whether you are sure. The threshold is “becoming aware”, not “having finished the investigation”.

In parallel, if card data is in scope, your acquirer needs to be notified under your merchant agreement, usually within 24 hours. They will almost certainly mandate a PCI Forensic Investigator - a specialist firm from a short list - to examine the cardholder data environment. You do not get to choose whether this happens. You get to choose how prepared you are when it does.

This is why incident response is not a thing you organise on the night. It is a thing you organise on a sunny Tuesday, when nothing is on fire, with a written plan, a phone tree, named decision-makers, pre-agreed legal counsel, and a relationship with a responder who already knows your environment.

The days and weeks that follow

Recovery from a serious incident is rarely the clean, two-day affair that operators imagine. In the cases I have worked on, sites are typically taking cash only for somewhere between 48 hours and a week. Reservations systems take days to rebuild from clean backups, assuming the backups themselves have not been compromised - and a depressing percentage of the time, they have. Loyalty databases need to be reviewed line by line. The PCI investigation runs for weeks. ICO correspondence runs for months.

Then come the second-order effects. Trade press coverage, if you are large enough to attract it. Awkward conversations with landlords and franchise partners. A measurable dip in covers for the sites that were named, lasting somewhere between three and nine months in the cases I have seen. Insurance premiums up at renewal - sometimes doubled, sometimes refused outright. And, in the worst cases, regulatory fines that arrive eighteen months after the event, when everyone has tried to move on.

None of this is intended to alarm. It is intended to make the maths honest. The cost of preventing a serious incident is, in almost every case I have looked at, an order of magnitude smaller than the cost of recovering from one.

How you actually prevent most of this

The good news is that the overwhelming majority of hospitality breaches are not the work of nation-state actors using zero-day exploits. They are opportunistic, and they exploit the same handful of weaknesses every time. Close those weaknesses and you are no longer worth the attacker’s time - they move on to a softer target.

The controls that matter most are not glamorous. Multi-factor authentication on every account that touches email, finance, or admin systems, with no exceptions for “the boss who hates it”. A patch management discipline that means POS terminals, back-office PCs, and network equipment are kept current, not running an end-of-life Windows build because nobody wants to touch the integration. Network segmentation, so that the back-office network, the POS network, the guest WiFi, and the CCTV are properly separated from each other. Endpoint detection and response on every device, not just antivirus that hasn’t been updated since the device was unboxed. Cyber Essentials certification as a baseline, not a ceiling. Regular phishing simulations and short, plain-spoken staff training. And monitoring - actual human or hybrid monitoring of the alerts that all of this generates, because controls without monitoring are a smoke alarm with the batteries out.
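None of this requires exotic tooling to verify, either. By way of illustration, here is a sketch that flags end-of-life Windows builds from a simple device inventory export. The CSV layout and file name are hypothetical, and the end-of-support dates, while taken from Microsoft's published lifecycle schedule, should be checked against the current lifecycle pages before anyone relies on them.

```python
"""Flag devices running end-of-life Windows builds.

A minimal sketch assuming a device inventory exported as a CSV with
'device_name' and 'os_version' columns (a hypothetical layout). The
end-of-support dates below should be verified against Microsoft's
lifecycle documentation before being relied on.
"""
import csv
from datetime import date

# Published end-of-support dates (verify against Microsoft's lifecycle pages).
END_OF_SUPPORT = {
    "Windows 7": date(2020, 1, 14),
    "Windows 8.1": date(2023, 1, 10),
    "Windows 10": date(2025, 10, 14),
    "Windows Server 2012 R2": date(2023, 10, 10),
}


def end_of_life_devices(inventory_path: str) -> list[tuple[str, str]]:
    """Return (device, os) pairs whose operating system is past end of support."""
    today = date.today()
    flagged = []
    with open(inventory_path, newline="") as f:
        for row in csv.DictReader(f):
            os_name = row["os_version"].strip()
            for product, eol in END_OF_SUPPORT.items():
                if os_name.startswith(product) and today > eol:
                    flagged.append((row["device_name"], os_name))
                    break
    return flagged


if __name__ == "__main__":
    for device, os_name in end_of_life_devices("device_inventory.csv"):
        print(f"{device}: {os_name} is past end of support")
```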

That is what our hospitality IT support practice does day in and day out: keep the boring fundamentals in working order so that the dramatic incidents do not happen in the first place.

The CloudMatters position

We approach cyber security as layered defence. No single control stops every attack. The job is to make sure that when one layer is bypassed - and one layer is always eventually bypassed - the next one catches it, and the one after that, and the one after that. Our SOC monitors 24/7 because attackers do not work office hours. We rehearse incident response with our clients because the night of the incident is a poor time to read the plan for the first time. And we are honest with operators about what their current posture actually looks like, because flattery does not protect a single till.

If you have read this far and you are not sure where your group stands, that uncertainty is itself the answer. The operators I have seen come through incidents in the best shape were the ones who had asked the question early, taken an honest look at the answer, and treated cyber security as an operational discipline rather than an IT line item. That is a conversation worth having before the Friday night that turns into a long weekend.

If you would like to start that conversation, our cyber security team would be glad to take you through what a layered defence looks like for a hospitality estate of your size and shape. Better to have the meeting now than the post-mortem later.