(See the first part of this article here for background.)
How, exactly, does AMS assign the lagged sales data?
I established, via occasional spot checks during the tests in the previous article, that it takes several days for the AMS-reported Orders (Units Sold) data to finalize.
I was asked if it trickles in randomly, or if it is assigned to the particular days that it should have been applied to originally. Since I didn't know the answer, I decided to find out.
I also now had a corrective for my “organic sales = effectively zero” assumption. Three weeks of testing in August established an organic baseline for my 2 advertised products (the 1st book in each of 2 series: ANNWN-N1 and CHAIN-N1) of 0.2 units per title per day.
So, I used the July 1-July 29 test period (which is now over a month old) and took a look.
First, I used the AMS reports to do a day-by-day look at the reported units sold. I compared that with the spot checks I took during the month (units sold as of a particular date); those checks ran well past the end of the month to let the lagging data catch up.
I discovered that the lagged data is assigned after the fact to particular days, presumably the dates that would have been correct at the time. In other words, my spot checks of units sold as of date X for the month are smaller than they would be if I could do that now, because lagging data has been applied to those dates sometime between when I did the spot check and now, a month later.
Undercounting & Missing data
So, now that the data is final, what does the undercounting look like?
There are several interesting things here. The blue line is the reported KDP units, and I “corrected” them in the gray line with my new organic sales algorithm (0.2 units/day/title).
You can see that there is a significant undercount, even with KDP correction, over this month-long period. The AMS count of reported units was 15, and the KDP count was 36 (or 24, corrected for estimated organic sales). Any way you look at it, that's a severe undercount.
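The arithmetic behind that correction can be sketched as follows. The unit counts (AMS 15, KDP 36) and the organic rate (0.2 units/title/day, 2 titles) come from the figures above; treating the test period as 30 days is my assumption, chosen because it reproduces the corrected figure of 24.

```python
# Sketch of the organic-sales correction described above.
# Figures are from the article; the 30-day period length is an assumption.

ORGANIC_RATE = 0.2   # organic units per title per day (August baseline)
TITLES = 2           # ANNWN-N1 and CHAIN-N1
DAYS = 30            # approximate length of the July test period

kdp_units = 36       # KDP-reported units over the period
ams_units = 15       # AMS-reported units over the same period

organic_estimate = ORGANIC_RATE * TITLES * DAYS   # 12 units not due to ads
kdp_corrected = kdp_units - organic_estimate      # 24 units plausibly ad-driven
undercount = kdp_corrected - ams_units            # 9 units AMS never attributed

print(organic_estimate, kdp_corrected, undercount)
```

Even after subtracting the estimated organic sales, AMS accounts for only 15 of the 24 remaining units.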
Now, look at the stretch of zero reported AMS units sold from day 8 to day 16 (red arrows), while KDP reports ordinary numbers of sales. To me, that screams “missing data” on the AMS side.
All my data processing background makes me believe that the AMS reporting system is an aggregate of several background systems, artificially yoked together. The evidence: the after-the-fact drift of primary data for Impressions, plus all of this suspect lag/undercount data for units sold. Any end system dependent on other systems will be fragile, and a chunk of 8 days of missing units-sold data would not surprise me as the result of some data processing hiccup anywhere in the chain.
During that period of missing AMS data, KDP reported (uncorrected) 10 units sold. That's a very big chunk of the apparently undercounted data. And I wouldn't have known about it if I hadn't done a day-by-day look at the AMS data well after the fact — my full-month checks wouldn't have turned it up.
How often does this happen? Who knows?
Lagged data behavior
Here are some looks at how the lag in reporting sales units in AMS works.
The green dotted line represents the after-the-fact data for units sold for each day in AMS (at the time, the values would be less). The yellow dotted line represents the 5 measurements I took as-of a date during the month when the test ran (before all the lagged data had been applied to the individual dates), with a smoothing process to link those observations together.
You can see how, for day 14, the as-of count of units sold (yellow) was considerably less than the eventual corrected units sold count (green) now visible in the AMS reports for that day. And, of course, the yellow line shows how the as-of spot checks eventually converge with the green line's after-the-fact correction. Not until about a week later were the AMS-reported units sold final.
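The retroactive-assignment behavior can be sketched like this. The snapshot values below are invented for illustration; only the behavior they model (each later AMS snapshot revises an earlier day's count upward until it settles about a week later) is taken from the observations above.

```python
# Hypothetical sketch: how a single day's AMS units-sold count grows
# across successive as-of snapshots as lagged data is applied to it.
# All numbers here are illustrative, not real report data.

snapshots = {
    # as-of date: {sale day: units AMS reported for that day at that time}
    "07-15": {14: 1},
    "07-18": {14: 2},
    "07-22": {14: 3},   # day 14's count keeps growing after the fact
    "07-29": {14: 3},   # final: no further change after about a week
}

def history_for_day(day, snapshots):
    """Reported count for one sale day, as seen in each successive snapshot."""
    return [snap.get(day, 0) for snap in snapshots.values()]

history = history_for_day(14, snapshots)
print(history)  # [1, 2, 3, 3]
```

The practical consequence: any spot check taken mid-month understates the figure AMS will eventually show for that same day.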
What's the takeaway from all of this?
The count of units sold in AMS is subject to data processing problems (including missing data), significant delays in units-sold reporting, and what seem to be significant undercounts of units sold (insofar as the organic units sold can be pinned down).
You cannot use the count of units sold in AMS (or the derivative calculations such as ACOS) as hard data.
Other shenanigans, from the KDP Sales reporting systems
Though this is not directly related to AMS Ads, you need to be aware of the limitations of Amazon's sales tracking systems (and of which one you're using) when attempting to compare unit sales to AMS-reported orders.
There are 5 ways to track your unit sales outside of AMS, so we need to define our terms.
The “Old Sales Dashboard” offers 3 different takes, which do not necessarily match each other. The “New Sales Dashboard” offers a 4th variant, and the Royalty Report from the “Old Sales Dashboard” is the ultimate authority, since it is what the KDP payments to us are based on.
In brief, the units reported by each of these eventually match up, but the dates reported do not, and that can cause inconsistencies when you're doing analysis vs AMS Ad Orders, especially across month-end boundaries.
This mismatch of the dates is another indication to me that there are at least 3 feeder systems going into these reporting systems.
Here's an example of each system.
All of these examples cover the same date period (8/30 through 9/3) and illustrate the discrepancies in when sales are recorded.
For convenience, the book abbreviations are:
ANNWN-N1 (To Carry the Horn)
ANNWN-N2 (The Ways of Winter)
CHAIN-N2 (Mistress of Animals)
CHAIN-N3 (Broken Devices)
System 1 – the Old Sales Dashboard Royalty Report (how KDP pays) claims:
(8/30 – 0 sales)
(8/31 – 4 CHAIN-N2)
9/1 – 2 ANNWN-N1
9/2 – 1 ANNWN-N2
9/3 – 1 CHAIN-N3
System 2 – the Old Sales Dashboard claims:
(8/30 – 2 units)
(8/31 – 4 units)
9/1 – 0 units
9/2 – 2 units
9/3 – 0 units
System 3 – the Old Sales Dashboard (date range = month-to-date) claims:
9/1 – 0 units
9/2 – 2 units
9/3 – 0 units
System 4 – The Old Sales Dashboard high-level month-to-date tab report claims:
As of 9/1 (total) – 1 ANNWN-N1
As of 9/2 (total) – 2 ANNWN-N1
As of 9/3 (total) – 2 ANNWN-N1, 1 ANNWN-N2, 1 CHAIN-N3
System 5 – The New Sales Dashboard claims:
(8/30 – 2 CHAIN-N2)
(8/31 – 2 ANNWN-N1, 2 CHAIN-N2)
9/2 – 1 ANNWN-N2, 1 CHAIN-N3
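Summing the figures above over the 8/30–9/3 window shows the pattern: the systems disagree on dates, but the totals reconcile. A minimal sketch, with figures copied from the lists above (per-title counts summed into plain units; Systems 3 and 4 are omitted because they only cover month-to-date):

```python
# Daily units per system over 8/30-9/3, copied from the lists above.
system1 = {"8/30": 0, "8/31": 4, "9/1": 2, "9/2": 1, "9/3": 1}  # Royalty Report
system2 = {"8/30": 2, "8/31": 4, "9/1": 0, "9/2": 2, "9/3": 0}  # Old Dashboard
system5 = {"8/30": 2, "8/31": 4, "9/1": 0, "9/2": 2, "9/3": 0}  # New Dashboard

totals = {name: sum(days.values())
          for name, days in [("1", system1), ("2", system2), ("5", system5)]}
print(totals)  # every system totals 8 units over the window

# ...but the Royalty Report assigns units to different dates than the
# dashboards do, shifting sales across the month-end boundary:
mismatched_days = [d for d in system1 if system1[d] != system2[d]]
print(mismatched_days)
```

Note that in this particular window the Old and New Dashboards happen to agree day by day; it is the Royalty Report that moves units across 8/31–9/1, which is exactly the kind of shift that distorts monthly statistics.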
A) There are 3 feeder systems and they eventually reconcile in terms of units, but not in terms of recorded dates — this becomes particularly obvious (if you look for it) across month end boundaries. (This impacts monthly statistics when testing in small quantities, which is how it came to my attention.)
B) None of them exactly matches the others on a day-by-day basis, including across month-end.
C) If you're going to do detailed tests, you either have to use the ultimate authority (System 1 – the KDP Royalty Report) or pick one of the other systems and stick to it. Naturally, the KDP Royalty Report (from the old sales dashboard) is the most inconvenient tool to track with.