Profit Margins & Real Revenue in Adobe Analytics: Part I

Erasing the Blind Spot of Marketing Measurement

— Also see Part II: The Technology behind it

“Just half of what people ordered in the online shop actually turned into true bottom-line revenue (i.e. the customer really paid for it)? So the data in Analytics is wrong? And aren’t we also sending false success signals to the Google Shopping and Facebook algorithms? Is it due to fraud? Or is it that customers buy products that are out of stock, so orders have to be cancelled? And the 50% that did get paid were mostly products with a horrible profit margin on top of it all?”

That was more or less the situation I encountered at a big retailer. To make one thing clear up front: this article is NOT about tracking errors or missed Conversions due to Ad Blockers etc. (for that, see my article on Ad Blockers)! It is about the fact that even correctly “tracked” Revenue is often far, far away from the “actual”, i.e. successfully invoiced, Revenue. Depending on the factors discussed below, it can be 50% off or more. Moreover, this gap is not constant at, say, 10%; it fluctuates heavily depending on assortment category, campaign, offered payment methods, and stock level issues. And of course: Corona!

In the following, I will use “Tracked Revenue” to refer to Revenue as tracked on your website (usually through your Tag Management System -> what you see in Analytics tools under “(Product) Revenue”), and “Actual Revenue” to refer to the revenue that actually ends up on the retailer’s bank account.

Marketers have long been convinced that Cost per Revenue (how much do I have to spend to earn 1 Euro?) or ROAS, Return on Ad Spend (how much do I earn for every Euro I spend?), are the most important KPIs for any Performance Marketing endeavour. But in every Advertising or Analytics solution out there, ROAS is based merely on the Revenue tracked on the website where the purchase happens. That is often a giant oversight. I call it the “blind spot of Digital Marketing”.
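To make the two KPIs concrete, here is a minimal Python sketch; all figures are made up purely for illustration:

```python
# All figures below are hypothetical, purely to illustrate the two KPIs.
ad_spend = 500.0          # EUR spent on a campaign
tracked_revenue = 2000.0  # Revenue as tracked on the website

# ROAS: how much revenue do I earn for every Euro of ad spend?
roas = tracked_revenue / ad_spend

# Cost per Revenue: how much do I have to spend to earn 1 Euro of revenue?
cost_per_revenue = ad_spend / tracked_revenue

print(f"ROAS: {roas:.2f}")                          # 4.00
print(f"Cost per Revenue: {cost_per_revenue:.2f}")  # 0.25
```

Note that both numbers are built on *Tracked* Revenue, which is exactly the blind spot discussed here.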

A random example: Campaign Categories 1 and 4 look as if their ROAS were a lot better than Category 2’s. But they are actually worse if you look at the bottom line.

Advertising giants like Google or Facebook have also done nothing to make you think otherwise, because that blind spot is out of their reach. Of course, their Attribution models are based on the very same “tracked” Revenue. Remember why image recognition software identified dark-skinned people as gorillas? Because the machines behind that software were fed skewed data in the first place. The same happens to your Attribution models in Google Analytics or Google Ads.

If you read the “Checklist on Comparing Apples to Apples”, this won’t surprise you: The Revenue tracked into your Analytics tool, even when implemented flawlessly and circumventing Ad Blockers, is rarely ever the “real” Revenue that counts towards the bottom line. And if your urge now is to stop reading and say “Well of course, that is trivial, but it is about 10–20% off in good tracking setups due to Ad Blockers, so I can still use Revenue as a proxy to reality”, I would beg you to read on for a bit at least. Because in many cases, that is not true!

Actual Revenue divided by Tracked Revenue for several Product Categories for a random period in early 2020 (before Corona): Tracked Revenue is often a misleading compass.

So what happened to this client that kept feeding Google Ads the Revenue which their ads supposedly generated? The algorithm of course increased the bids for those same ads, and more and more traffic and “Revenue” were generated. So the Cost per Tracked Revenue kept sinking, while the Cost per Actual Revenue (which was invisible in Analytics at that time) rose and rose.

When they realized how little of that Tracked Revenue ended up getting really invoiced, they stopped multiple Google Shopping and Facebook campaigns.

Digital Analytics aka “Web Analytics” Tools like Google Analytics or Adobe Analytics are good for analyzing what happens on your website or app. They group traffic into Marketing Channels, show you Click and Conversion Rates for your on-site search, at what step people exit your checkout or any other funnel, or how much Revenue is generated by teasers and other clickable on-site elements.

But it gets complicated when you want to feed these tools with data that is not generated on a website or app (e.g. the cancellation of an order) and then merge it into an existing Visitor profile. It is possible, though. And for this client, the benefit of having Actual Revenue and Profit Margins in Adobe Analytics is huge: it is the place where Category Managers and Marketing staff already do their analyses, the data is connected to everything else Visitors do (Profit Margin per On-Site Search Term or per Teaser on the Homepage? -> no problem!), and they can analyze the data with what is probably the best ad-hoc analysis and reporting interface for non-nerds, Adobe’s “Analysis Workspace”.

That being said, if you do not use Adobe Analytics or prefer to merge all that data in another place and have a lot of great data engineers at your disposal, you can also go for that. The best strategy depends on your use case. My message is simply: You have to become aware of Profit Margins instead of just Revenue, and of these often enormous gaps between Tracked and Actual Revenue.

In the case of the client at hand, they were desperate to see their campaign performance in more than just Return on Ad Spend (ROAS). This became even more important when they started a test with Google Shopping Ads bidding based on profit margin instead of Tracked Revenue. The expectation was that Revenue would go down, but that the Margin (= Profit), and thus the bottom line, would rise in return.

Since Google claims that with this method it would push ads for high-margin products more, using ROAS to compare “profit-driven” campaigns with “ROAS-driven” campaigns made no sense. The only true currency here is the Margin-to-Ad-Spend Ratio.
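A tiny sketch with invented numbers shows why the two rankings can contradict each other: a campaign can win on ROAS and still lose on the Margin-to-Ad-Spend Ratio.

```python
# Two hypothetical campaigns with identical ad cost: one sells a lot of
# low-margin revenue, the other less revenue but at a higher margin.
campaigns = {
    "ROAS-driven":   {"ad_cost": 100.0, "revenue": 800.0, "margin": 40.0},
    "profit-driven": {"ad_cost": 100.0, "revenue": 500.0, "margin": 90.0},
}

results = {}
for name, c in campaigns.items():
    results[name] = {
        "roas": c["revenue"] / c["ad_cost"],        # Revenue / Ad Spend
        "margin_roas": c["margin"] / c["ad_cost"],  # Margin-to-Ad-Spend Ratio
    }

# The ROAS ranking and the margin ranking point in opposite directions:
print(results["ROAS-driven"])    # ROAS 8.0, but only 0.40 margin per Euro spent
print(results["profit-driven"])  # ROAS 5.0, yet 0.90 margin per Euro spent
```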

“Margin / Ad Cost” vs. “Revenue / Ad Cost (ROAS)”: What is efficient ROAS-wise does not mean it is also efficient profit-wise

So using a Revenue-based metric like ROAS to assess campaigns would be completely misleading. However, assessing such campaigns only by the expected margins at purchase time also meant excluding factors like the rate of people buying, but not paying (see Actual vs. Tracked Revenue above).

This meant we needed to get both the Actual (bottom-line) Revenue and the (bottom-line) Margins into Analytics. For this, we are using the most underrated feature in Adobe Analytics: the great old Transaction ID-based Data Sources. More details on the “how” follow in Part II.

What makes “Tracked Revenue” less useful

At this client, the following problems prevented the team from trusting their Adobe Analytics Revenue data:

In Switzerland, the most popular payment method is invoice (aka “bill”). This method allows you to go through the checkout without sharing credit card information and pay the invoice by wire transfer from your bank account within e.g. 14 days.

Right column: % of “Actual” divided by “Tracked” Product Revenue (left column), grouped by chosen payment method. Red = low rate, green = high rate.

If the customer chooses “invoice” and is deemed solvent by an after-the-order financial check, she gets her products delivered right away. If not, she is relegated to the prepayment method, where nothing gets delivered until she has effectively wired the money.

This leads to the following: many people claim during checkout that they are going to pay by invoice, but they never end up actually paying. Especially in high-priced “status symbol” segments like smartphones or notebooks, this sometimes fraudulent pattern is common. It also means that the tendency for this pattern is not the same across all assortments and campaigns. Which in turn means you cannot simply say “we have to subtract about x% of the Tracked Revenue to approximate the Actual Revenue”. You have to monitor it constantly!

11 Product Categories, number 3 compared by payment method

See the example above which shows 11 Product Categories:

  • Column 1 (“Actual Revenue” divided by “(Tracked) Product Revenue”) shows how vastly the ratio of Actual to Tracked Revenue can fluctuate: from 55.8% for Category 3 (only a bit more than half of the tracked revenue is actually paid!) up to 95.5%.
  • Column 2 (% Margin on Actual Revenue) shows the bottom-line Profit Margin per Category
  • Column 3 (% Margin on Tracked Revenue) shows the bottom-line Margin compared to the Tracked Revenue.

As mentioned, in Category 3 only 55.8% of the Tracked Revenue ends up actually getting paid. When we look at the Payment Methods, we see that e.g.

  • “bill” (invoice), which usually has a decent rate, gets only 54.4% here. So in Category 3, people who claim they will pay by invoice very often do not pay after all.
  • Even worse are “instalment” and “prepayment”. However, when looking at the actual margins, we see that prepayment, for those 36% of Revenue that effectively get paid, still has a very good Margin on Actual Revenue of 22%, which we would not have realized had we compared the margin only to the Tracked Revenue (8.2%).
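The prepayment arithmetic can be replayed in a few lines of Python. The numbers are loosely modelled on the Category 3 example above; they are illustrative, not the client's raw data:

```python
# Illustrative figures, loosely modelled on the Category 3 prepayment example.
tracked_revenue = 1000.0
actual_revenue = 360.0   # only 36% of tracked revenue effectively gets paid
actual_margin = 82.0     # absolute margin earned on the orders that were paid

paid_rate = actual_revenue / tracked_revenue         # 0.36
margin_on_actual = actual_margin / actual_revenue    # ~0.228 -> a good margin
margin_on_tracked = actual_margin / tracked_revenue  # 0.082 -> looks terrible

# Judged against Tracked Revenue, the margin looks bad (8.2%), but on the
# revenue that was actually paid it is respectable (~22.8%).
```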

All my E-Commerce clients have stock issues. That means their shop, or worse, their campaigns still show products that actually cannot be sold (anymore): a huge waste of Advertising Budget that is hard to truly conquer given all the messy and often failing product feeds around. This retailer also lets customers “pre”-order products that are currently out of stock and delivers them once they are back in stock. But this leads to major problems when demand for certain assortments increases unexpectedly.

The problem already existed last year during the record-hot June and July, which led to swaths of people in Switzerland buying an air conditioner for the first time ever (I care about the climate, so I didn’t). For weeks, you could not get ACs or fans anywhere.

Similarly, with the Corona pandemic, demand for e.g. sanitizers, breathing masks, toilet paper, home office equipment (and even puzzles) spiked so dramatically that many orders had to be cancelled because there was simply no chance to deliver these items anytime soon. Or people were told the items would hopefully arrive in a few weeks, which in turn led many of them to cancel their orders.

So again, the client was buying Google Ads which on the surface performed marvellously well because so much “Tracked Revenue” came in, but that was just an illusion once the “Actual Revenue” was considered. As the gap to “Tracked Revenue” only comes to the surface a couple of days later, I absolutely urge you to track Stock Status into your Analytics! It is usually a good predictor of Product Conversion Rates (Orders / Product Detail Visits) and sometimes (not always!) a good proxy for Revenue that is lost later on because you cannot deliver:

Tracking Out-of-Stock Status can sometimes predict a low percentage of Revenue actually getting paid (=> products cannot be delivered)
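How such monitoring could work in principle, as a schematic Python sketch. The category names, the sample data, and the threshold are all made up, not the client's setup:

```python
from collections import Counter

# Hypothetical Product Detail View records with the tracked stock status.
views = [
    {"category": "AC & Fans", "stock": "out_of_stock"},
    {"category": "AC & Fans", "stock": "out_of_stock"},
    {"category": "AC & Fans", "stock": "in_stock"},
    {"category": "Notebooks", "stock": "in_stock"},
    {"category": "Notebooks", "stock": "out_of_stock"},
    {"category": "Notebooks", "stock": "in_stock"},
]

ALERT_THRESHOLD = 0.5  # alert if >50% of detail views hit out-of-stock products

total = Counter(v["category"] for v in views)
oos = Counter(v["category"] for v in views if v["stock"] == "out_of_stock")

alerts = [cat for cat in total if oos[cat] / total[cat] > ALERT_THRESHOLD]
print(alerts)  # ['AC & Fans'] -> 2 of 3 detail views were on out-of-stock items
```

In practice you would of course build such alerts on top of your Analytics tool's alerting (as in the screenshot below), not in a script; the sketch only shows the logic.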

Out-of-Stock Traffic, together with Error (404) Pages and “Product Not Found” Traffic (a special error page for products that have recently gone offline), is thus also one of the things I always monitor with Alerts:

Must-have: Alerts for Stock Issues and other indicators of wasted traffic

Add to those the normal every-day stuff, e.g.

  • people calling or mailing customer service to cancel their order or change the number of units (because they e.g. accidentally ordered two but only need one)
  • a customer agent calling the customer to suggest that product A may be a better option than B, offering to change the order to product A if he wishes
  • business clients getting an additional discount after the fact, following a chat with their account manager
  • bundles that are one product in the shop being split up into several products on the backend side, etc.

All these things are usually minor in terms of their impact.

And, of course: Product Returns aka “Refunds”. I wrote about returns and cancellations here some time ago. Suffice it to say that Returns are not considered in this article. Returns should be looked at separately because, with a Return, the initial order was paid and delivered: the order is not simply cancelled, and availability or solvency have no impact on Return Rates. But monitoring your Return Rates is of course crucial as well!

Examples by Marketing Channel: Mind the Margin!

Let me close with some more Online-Marketing-focused examples that further prove the often vast differences between Tracked and Actual Revenue, as well as how important it is to look at Margins instead of only Revenue (something 95% of Online Marketers ignore).

Look how Return on Ad Spend changes from Channel to Channel when comparing Tracked to Actual Revenue: The 4th channel gets about 45% less efficient!

Tracked and Actual (Bottom-Line) Revenue even differ vastly by Marketing Channel!

The reason here was a handful of Social Media campaigns that advertised attractive high-status products to audiences who purchased but never paid for them.

The next and final example combines Channels and Product Categories. It is a pity that I can only show relative numbers, but it outlines the metrics you should look at for your Marketing Performance Analysis. It shows Marketing Channels, with Channels 1 and 3 broken down by Product Category:

Marketing Channel Performance Metrics, Channel 1 and 3 broken down by Product Categories

The Metrics in detail:

  • Actual / Tracked Revenue: How much of what we track ends up getting paid
  • % Margin on Actual Revenue: Profit Margin / Actual Revenue (e.g. if you earn 10 EUR for 100 EUR sold, the % Margin is 10%)
  • Actual Margin / Ad Cost (“Margin ROAS”): The Margin divided by Ad Cost, i.e. how much Margin you earn per Euro of Ad Spend => the closest thing you can get to the actual bottom-line profit of a Marketing Campaign!
  • Tracked ROAS: What you usually get in your Advertising or Web Analytics solution: Tracked Revenue / Ad Cost
  • Actual ROAS: Actual Revenue / Ad Cost -> counter metric for Tracked ROAS (see orange bar graph above the table)
  • % Delta Tracked / Actual ROAS: (Actual ROAS − Tracked ROAS) / Tracked ROAS -> How much does the ROAS go up or (usually) down if we look at Actual instead of Tracked Revenue? See e.g. Channel 3, which loses 24.4% on average, but only 8.5% for Category 2.
  • Ad Cost / Actual Margin: How much do I have to spend for 1 EUR of Profit (inverted “Margin ROAS”)?
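All of the metrics above derive from just four inputs. A small Python helper with invented figures shows the arithmetic; the sample numbers are chosen so that the delta happens to match the 24.4% loss mentioned for Channel 3, purely for illustration:

```python
def channel_metrics(tracked_revenue, actual_revenue, actual_margin, ad_cost):
    """Compute the performance metrics listed above for one channel/category."""
    tracked_roas = tracked_revenue / ad_cost
    actual_roas = actual_revenue / ad_cost
    return {
        "actual_to_tracked": actual_revenue / tracked_revenue,
        "pct_margin_on_actual": actual_margin / actual_revenue,
        "margin_roas": actual_margin / ad_cost,      # Actual Margin / Ad Cost
        "tracked_roas": tracked_roas,
        "actual_roas": actual_roas,
        "delta_roas": (actual_roas - tracked_roas) / tracked_roas,
        "cost_per_margin": ad_cost / actual_margin,  # EUR spent per 1 EUR profit
    }

# Invented figures for one hypothetical channel:
m = channel_metrics(tracked_revenue=10000.0, actual_revenue=7560.0,
                    actual_margin=900.0, ad_cost=600.0)
print(round(m["delta_roas"], 3))  # -0.244 -> ROAS drops 24.4% on Actual Revenue
```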

So I hope this piece helped you understand the problem behind the “blind spot in Online Marketing”. Look out for Part II on how to get the Profit Margin and Actual Revenue into Adobe Analytics.

You may wonder why, in the examples, the “Actual Revenue” is almost always LESS than the Tracked Revenue, when instinctively you would guess that Analytics should show LESS Revenue because of Ad Blockers etc. First of all, at this client we use a server-side tracking method where Ad Blockers are not a problem. But more importantly, “Actual Revenue” is ONLY imported for Transaction IDs that were also tracked into Analytics in the first place. So if Analytics never tracked a Transaction (Order), it will not receive updated info on that order either. This is important because …

  • Transactions that were never in Analytics cannot be attributed to anything (no search term, no campaign, no user cookie)
  • but most importantly, this way we also avoid the risk of offline or giant API orders accidentally messing up the “Actual Revenue” data in Analytics.
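The import filter can be pictured as a simple lookup against the set of tracked Transaction IDs. This is a schematic Python sketch with made-up order data, not the actual import mechanism:

```python
# Hypothetical backend export: transaction ID -> revenue actually invoiced.
backend_orders = {
    "T1": 120.0,
    "T2": 80.0,
    "T3": 9999.0,  # e.g. a giant offline/API order that never hit the website
    "T4": 45.0,
}

# Transaction IDs that were tracked into Analytics at purchase time.
tracked_ids = {"T1", "T2", "T4"}

# Only import Actual Revenue for transactions Analytics saw in the first place;
# everything else (like T3) is dropped and cannot skew the data.
importable = {tid: rev for tid, rev in backend_orders.items()
              if tid in tracked_ids}
print(importable)
```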

— Also see Part II: The Technology behind Campaign Cost, Profit Margins and Actual Revenue in Analytics


Digital Analytics Expert. Owner of dim28.ch. Creator of the Adobe Analytics Component Manager for Google Sheets: https://bit.ly/component-manager
