
Your customers are teaching you

For ecommerce teams who’d rather learn from peak than survive it.

Peak is more than a conversion spike. It is a live usability test for your brand – and your customers are writing the research report in real time.

When they are rushed, time-poor, and spoilt for choice, they stop being polite and start being honest. Every bounce, complaint, and support ticket is a reaction to what you actually put in front of them – not the version you thought you had.

This is Part 3 of a six-part series on using peak as a masterclass, not a crisis.

Now we stay with the same discipline, but change the view. This time, you are looking at the stress test through your customer’s eyes.

Peak is your biggest live usability test

During peak, your customers behave differently:

  • They are goal-driven and impatient.
  • They compare more brands, in less time.
  • They have no interest in learning how your site works.

That is exactly why this period is so valuable. Small irritations that people would forgive in March become hard “no”s in November. Any mismatch between promise and reality is exposed very quickly.

You do not need a lab to see this. You already have:

  • Traffic at volume
  • Emotion at volume
  • Consequences at volume

The question is whether you treat what you are seeing as background noise in analytics or as live research.

Behaviour is feedback, beyond the numbers

Most ecommerce teams have their reporting foundations in place. We spend a lot of time helping leaders move from entry-level dashboards to more useful ones – better views, better breakdowns, better alerts.

By this point, you are probably tracking at least:

  • Overall conversion rate
  • Bounce rate
  • Sessions and channel mix

Those numbers matter, but they only say “something happened”. To learn from peak, you need to understand what people were trying to do, what they actually saw, and how that experience might have felt.

A few examples.

Conversion rate dips while traffic surges

The headline view says “performance got worse”. The story might be “our messaging brought in the wrong intent” or “the promise in the ad did not match the landing page”.

Bounce rate spikes on a key landing page

The metric says “they left quickly”. The story might be “the first screen did not confirm they were in the right place” or “we opened with what we care about, not what they came for”.

Sessions and support contacts rise together

The graph says “busy”. The reality might be “customers cannot access basic information on their own”, so they are forced into chat or email just to complete simple tasks.

What you are really doing here is pairing hard numbers with a human reading:

  • What were they trying to achieve?
  • What did they encounter?
  • What kind of feeling would that create?

When you combine those two layers, behaviour stops being a line on a chart and becomes feedback you can actually act on.

In last week’s article, that promo banner designed on gut feel rather than insight lives in this space. This week is about finding and logging the signals that would have led to a different decision: what your customers actually value, fear, and expect when they show up in peak mode.

Four places your customers are already talking

You do not need a new platform to listen better. You need a sharper lens on what you already have.

1. On-site behaviour

Start with analytics, but read it like a narrative, not a spreadsheet.

Which journeys move cleanly from landing to checkout? Where do people loop, scroll back, or rage-click? Which “important” content blocks are barely touched?

A serious, time-pressured customer does not wander for fun. If they are looping, something is not pulling its weight.

2. Support channels

Support tickets, live chat logs, phone calls, WhatsApp threads – all of them are qualitative research.

Look for:

  • The questions people ask again and again
  • Confusion about options, delivery times, or returns
  • Situations where customers try to do something the site does not make obvious

Any time a person has to explain something that should be clear from the interface or content, you are seeing a design or communication gap.
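One quick way to surface “the questions people ask again and again” is a frequency count over exported transcripts. A minimal Python sketch, using hypothetical data; the normalisation rules are an illustrative starting point, not a fixed method:

```python
from collections import Counter

def normalise(question: str) -> str:
    """Fold case, drop punctuation, and collapse whitespace so near-duplicate
    questions ("Next-day?", "next day??") group together."""
    cleaned = "".join(
        ch for ch in question.lower().replace("-", " ")
        if ch.isalnum() or ch.isspace()
    )
    return " ".join(cleaned.split())

# Hypothetical questions pulled from chat transcripts (illustrative data).
questions = [
    "Does next-day delivery include Saturday?",
    "does next day delivery include saturday",
    "Can I return a sale item?",
    "Does next-day delivery include Saturday??",
    "Where is my order confirmation?",
    "can I return a sale item?",
]

counts = Counter(normalise(q) for q in questions)
for question, n in counts.most_common(3):
    print(n, question)
```

Even a rough count like this makes the repeat offenders obvious, which is usually enough to know which answer belongs on the page rather than in the inbox.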

3. Reviews and social comments

Reviews during peak often carry more emotional weight.

“Arrived exactly when promised” suggests your delivery message landed. “Didn’t arrive in time for the event” shows a gap between the date in your head and the date in theirs. “Code didn’t work and support took ages to reply” points at promotion clarity, tooling, and response time in a single sentence.

On social, ignore the vague “sentiment score” and look for repeated themes:

  • Trust – “I don’t believe this offer.”
  • Time – “I’m not sure this will arrive when I need it.”
  • Risk – “If this goes wrong, I’m stuck.”

Customers will not organise this neatly for you. That is your job – and it only works if you keep examples somewhere you can find them later.

4. Post-purchase behaviour

What happens after checkout tells you a lot.

Do customers open your order confirmation? Do they track the parcel once, or three times? Do they get in touch before the delivery window has even closed?

This is a clear window into their anxiety. If people keep checking the same details, they either do not trust them, or cannot remember where to find them.

Bonus: if you want a new tool, watch real sessions

If you add one tool into this mix, make it a session replay tool such as Hotjar or Microsoft Clarity.

Sit down and watch real customers use your site:

  • The hesitation as they hover over a vague delivery promise
  • The scroll back up because the first screen did not land
  • The exact moment they bail and leave

At JH, we watch hundreds of these sessions every week, and it is often one of the quickest ways to spot experience issues. Watching someone fight your website is humbling – and it is something decision-makers across the business should see, not just UX or CRO.

An afternoon of real sessions will usually teach you more than weeks of reports.

The Customer Signal Log

In Week 1, you started a simple log of what peak was teaching you – what bent, what broke, what surprised you.

Do not start again from scratch. Evolve that same log.

Add a new tab, section, or block called Customer Signals and give it four simple fields:

  1. Observation – what happened, and where
  2. What this might mean – your current reading
  3. Evidence – link, screenshot, or snippet
  4. Emotion – the dominant feeling behind it

For example:

  • Observation: Live chat spike on Thursday afternoon asking whether “next-day” delivery includes Saturday.
  • What this might mean: Our delivery promise is unclear. Customers are unsure what “next-day” actually lands as, especially close to the weekend.
  • Evidence: Link to a chat transcript, plus an analytics segment showing increased chat use on the checkout page.
  • Emotion: Time / risk – people are worried their order will miss an important date.

You do not need perfect wording. The aim is something your future self can skim in January and instantly remember the situation.
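If the log lives in a spreadsheet or a script, the four fields above map onto a tiny record. A minimal Python sketch; the class and field names are illustrative, not a prescribed schema:

```python
from dataclasses import dataclass, asdict

@dataclass
class CustomerSignal:
    observation: str  # what happened, and where
    meaning: str      # "what this might mean" - your current reading
    evidence: str     # link, screenshot path, or snippet
    emotion: str      # dominant feeling, e.g. trust / time / value / effort / risk

# Illustrative entry based on the example above.
signal = CustomerSignal(
    observation="Live chat spike Thursday pm: does next-day delivery include Saturday?",
    meaning="Delivery promise is unclear, especially close to the weekend.",
    evidence="chat transcript link; checkout-page chat segment",
    emotion="time / risk",
)

row = asdict(signal)  # plain dict, ready to append as a spreadsheet or CSV row
```

The structure matters less than the habit: one row per signal, captured while the context is still fresh.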

Later in the series – especially in Week 5 – you will bring this Customer Signals section together with your internal friction notes and use both to decide what to fix first. This week, the win is simply capturing enough detail that the moment is not lost.

Bringing C.A.L.M. to customer insight

In Week 2, we introduced C.A.L.M.:

  • Capture
  • Analyse
  • Learn
  • Measure

This week we lean heavily on the first three.

  • Capture – Get customer signals into the log as close to real time as you can. No polishing, no overthinking.
  • Analyse – Light tagging: is this pointing at a design, process, system, or leadership gap? Is the underlying issue trust, time, clarity, or something else?
  • Learn – Add one line: “If we take this seriously, what would we change next year?”

Leave Measure for Week 5, when you are debriefing calmly. Trying to force impact and effort scores now will drag you straight back into firefighting.
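The light tagging in the Analyse step can be roughed out with simple keyword matching before anyone does a deeper read. A Python sketch; the keyword lists and tag names here are assumptions for illustration, not a fixed taxonomy:

```python
# Illustrative keyword heuristics for a first-pass "Analyse" tag.
GAP_KEYWORDS = {
    "design": ["banner", "button", "layout", "page"],
    "process": ["refund", "returns", "dispatch"],
    "system": ["error", "timeout", "crash"],
}

ISSUE_KEYWORDS = {
    "trust": ["believe", "promised", "said one thing"],
    "time": ["arrive", "next-day", "late", "deadline"],
    "clarity": ["unclear", "confus", "where do i"],
}

def tag(observation: str) -> dict:
    """Return rough gap and issue tags for a logged observation."""
    text = observation.lower()
    gaps = [g for g, kws in GAP_KEYWORDS.items() if any(k in text for k in kws)]
    issues = [i for i, kws in ISSUE_KEYWORDS.items() if any(k in text for k in kws)]
    return {"gap": gaps or ["unknown"], "issue": issues or ["unknown"]}

result = tag("Customers unsure whether next-day delivery will arrive before Saturday")
print(result)
```

A first pass like this is deliberately crude; its job is to cluster signals quickly so a human can do the real reading in the Week 5 debrief.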

Remember last week’s mindset from Friction Reveals Truth: high-performing retailers are not chasing perfection. They are nudging 9/10 experiences towards 10/10. Your customers are doing a good job of pointing at where that extra one-point gain lives – if you let them.

The emotional layer

Analytics answer “what happened?”. Support logs and reviews hint at “why?”. Emotion tells you “how much it mattered”.

As you scan your Customer Signals section, look for tone:

  • Panic – “Will this arrive in time?”
  • Distrust – “You said one thing on the homepage and another in the checkout.”
  • Relief – “Turned up earlier than expected – brilliant.”
  • Disappointment – “Was a gift and now it’s going to be late.”

You do not need a complex taxonomy. Simple tags like trust, time, value, effort, risk are enough.

The point is to make sure that when you are planning 2026, you are designing around how peak felt for buyers as well as how it performed for the business.

This is where differentiation tends to hide. Features can be copied. The way customers feel during high-stakes moments is much harder for competitors to replicate.

This week’s job

Your job for Week 3 is deliberately simple, and it builds on what you have already started:

  • Take the log you set up in Week 1 and add a Customer Signals section to it – same document, new area.
  • Each day, add a handful of real examples – behaviour, what it might mean, and a link or screenshot.
  • Tag each one with the type of gap it points at (design, process, system, or leadership) and the dominant emotion behind it.

You are not fixing everything yet. You are banking the truth while it is loud and clear.

By January, the numbers will still be there. The detail and the emotion will not be – unless you capture them now.

Next week, we shift focus again. So far, we have looked at systems and customers. In Week 4 – The Hidden Cost of Heroics & The Slack Channel That Says It All – we will look at your team: who is carrying the weight, what your internal channels really say about your culture, and how leadership behaves when peak bites.

Your customers are teaching you right now. Do not waste the lesson.

JH – The Breakthrough Agency

As an extension of your team, JH partners with Magento and Adobe Commerce brands to treat peak as your biggest usability test – capturing what customers show you under pressure and building those lessons into UX, CRO, and delivery plans that create real breakthroughs rather than simply fuller backlogs.