1. Segments Are Tied To SAINT Classified Dimensions
This is based on a recent issue Adam ran into while deleting a classified column as part of a new SAINT upload: segments mapped to one of the deleted columns stopped working. Below is a verbatim writeup of what Adam sent me, and I hope you find it useful. Thanks, Adam!
Segments built using classified variables follow a unique reference method that is highly sensitive to changes in SAINT tables and order. Unlike eVars, props, and standard OOTB dimensions, classified variables are referenced by segments using a back-end numeric ID (aka ‘Div’) prior to use in reporting. These Div#’s are set at the time AA Classifications are added, incremented by 1 from top-to-bottom.
For example, with three classifications, the back-end IDs and the segment references line up as follows:
| AA Classification Order | Backend Numeric ID | Segment Reference |
| ----------------------- | ------------------ | ----------------- |
| Classification 1        | Div1               | Div1              |
| Classification 2        | Div2               | Div2              |
| Classification 3        | Div3               | Div3              |
Similarly, the segment reference Div is set at the time each segment is created and - depending on the type of change being made to Classifications in AA - there can be several (potentially severe) business and reporting impacts:
- DELETING a classification in AA removes that variable and its associated Div data from the backend. Any segment that leveraged the classified variable linked to that Div will then show as ‘Undefined’ and will no longer function as intended.
- EDITING or renaming a classification in AA has no effect on the Div itself; any segment which uses that classification will simply show the updated name.
- ADDING a classification will always increment by +1 on the highest Div# in the system, even if a lower Div# was deleted. For example, deleting Classification 2 from the above table and subsequently adding a new Classification will create Div4; Div2 (Classification 2) will still not exist. To restore Div2 (in this example), the user must delete all Classifications below Classification 1, add Classification 2 again, and re-upload the desired SAINT tables.
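The increment-on-max numbering behavior can be illustrated with a small sketch. This only simulates the back-end behavior described above; it is not an Adobe API, and the classification names are placeholders:

```python
# Illustrative simulation of the Div numbering rules described above.
# NOT an Adobe API -- it only models the increment-on-max behavior.

def next_div(existing_divs):
    """New classifications always get (highest existing Div) + 1."""
    return max(existing_divs, default=0) + 1

divs = {1: "Classification 1", 2: "Classification 2", 3: "Classification 3"}

# DELETING Classification 2 removes Div2; segments referencing Div2 break.
del divs[2]

# ADDING a new classification creates Div4, not Div2.
new_id = next_div(divs)            # 4, because the highest remaining Div is 3
divs[new_id] = "Classification 4"

print(sorted(divs))                # [1, 3, 4] -- Div2 still does not exist
```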
2. Reconcile Missing eCommerce Orders With s.transactionID
Late last year I wrote an article about data variance between Web Analytics tools and mentioned that there is a way to reconcile missing eCommerce orders on the purchase page with help from Adobe engineering. The trick is to pass s.transactionID on the page right before the purchase page (client side) as well as in the backend (server side), so that engineering can reconcile the data.
This is not straightforward, as it will require additional support from your developers, but the Adobe engineering team can take the two transactionIDs, reconcile them in the backend, and then backfill any missing orders. Note that there is an additional cost to work with engineering, so ONLY do this if it's absolutely necessary.
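As a rough sketch of the idea, the snippet below builds an image-request URL carrying a shared transaction ID via the xact query parameter (the parameter AppMeasurement populates from s.transactionID). The tracking server, rsid, page name, and ID format are all placeholders, and the actual reconciliation workflow must be agreed with Adobe engineering:

```python
from urllib.parse import urlencode

def purchase_hit(tracking_server, rsid, transaction_id, order_total):
    """Build a hit carrying the transaction ID on the page before confirmation.

    The same transaction_id should be recorded by your backend (server side)
    so Adobe engineering can match the two sets and backfill missing orders.
    """
    params = {
        "pageName": "checkout:payment",   # the page BEFORE the purchase page
        "xact": transaction_id,           # maps to s.transactionID client side
        "events": "event1",               # placeholder event
        "products": f";SKU123ABC;1;{order_total}",  # placeholder SKU
    }
    return f"https://{tracking_server}/b/ss/{rsid}/0?{urlencode(params)}"

print(purchase_hit("metrics.example.com", "myrsid", "TX20240101001", "59.95"))
```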
3. Sending Historical Data To Adobe Analytics Is Possible BUT..
This is accomplished by passing historical attributes (eVars, props, events, etc.) along with the timestamp variable (s.timestamp), but the main issue is the out-of-order nature of the hits being sent in, as it can affect the sequence of events and the attribution of eVars. The only use case I've seen for this is backfilling historical or missing data, and it should ONLY be done when you have no other option. I've included a sample below using a GET request via the Data Insertion API method.
Notice that the timestamp is for a previous time frame. I'd also suggest changing the timestamp setting in your report suite to optional for best results, especially if you're using a global setup capturing both mobile app and desktop data.
http://(tracking server)/b/ss/(rsid)/0?AQB=1&pccr=true&g=none&ts=1559335477&products=;testvarn;1;5.95&events=event1&pageName=testPN&mid=5670954657508492467610283427418697012&AQE=1
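A hit like the one above can be assembled programmatically; here is a minimal sketch that back-dates the ts value by a chosen number of days. The tracking server, rsid, and mid are placeholders, and the request will only be accepted by a report suite whose timestamp setting allows it:

```python
import time
from urllib.parse import urlencode

def historical_hit(tracking_server, rsid, mid, days_ago, **analytics_vars):
    """Build a Data Insertion GET request with a back-dated Unix timestamp."""
    ts = int(time.time()) - days_ago * 86400   # Unix seconds, in the past
    params = {"AQB": 1, "pccr": "true", "g": "none", "ts": ts,
              "mid": mid, "AQE": 1, **analytics_vars}
    return f"http://{tracking_server}/b/ss/{rsid}/0?{urlencode(params)}"

url = historical_hit("metrics.example.com", "myrsid",
                     "5670954657508492467610283427418697012", days_ago=30,
                     pageName="testPN", events="event1",
                     products=";testvarn;1;5.95")
print(url)
```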
4. Possible To Send Negative Revenue In The Products String
One other rare scenario is when your client-side code accidentally captures a really large order value, which throws off your overall Revenue metric. This happened recently for a client who accidentally sent over a million dollars for a single SKU. You can fix this by sending an equivalent negative revenue amount for the same SKU for which you sent the wrong amount. I've demonstrated it using a small amount of $400, but the concept is the same.
This shows what the products string looks like when we send a negative revenue amount. Note that we sent a different purchase ID: duplicate order IDs are ignored by Adobe, so the same purchase ID cannot be used to send another order.
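As a sketch, the corrective hit might look like this. The SKU and purchase IDs are illustrative; the key details are the negative price in the products string and the new purchaseID:

```python
# Original (wrong) hit: $400 attributed to SKU123ABC under purchaseID ORDER1001.
# Corrective hit: same SKU, quantity 1, price -400, and a NEW purchaseID,
# because Adobe ignores repeated order IDs as duplicates.
def products_string(sku, qty, price):
    """Assemble a products string: ;product;quantity;price."""
    return f";{sku};{qty};{price}"

corrective = {
    "purchaseID": "ORDER1001-FIX",   # must differ from the original order ID
    "events": "purchase",
    "products": products_string("SKU123ABC", 1, -400),
}
print(corrective["products"])        # ;SKU123ABC;1;-400
```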
Finally, we can see that the overall revenue for SKU123ABC is now $0 since we sent a negative revenue amount. The other thing to consider is that you'll have an additional Orders event captured because we used a new purchase ID, but in the larger scheme of things you can live with that and ignore it via segmentation.
5. Tying Server-Side Analytics To Client-Side Target (A4T)
This is from a recent test I did with a client on A4T where Analytics is deployed server side but Target is deployed client side. The data is sent to Adobe Analytics via the Data Insertion API, and the challenge was that there is no Supplemental Data ID, which is normally what ties Analytics to Target client side.
In this case, the fix is to send the tnta variable (which captures A/B test variant information), which can be passed as a query string parameter on both page view and click events. This ensures that client-side Target matches up with server-side Adobe Analytics and populates the A4T reports. Below is a sample (event) snippet you can send to test this, which will populate the Analytics for Target reports in Adobe Analytics. Note that we've added pe=tnt to denote that it's a click event; for page views, you'll simply pass the tnta param without pe=.
http://(tracking server)/b/ss/(rsid)/0?ts=1559335477&products=;testpev;1;5.95&events=event1&pageName=testpev&mid=5670954657508492467610283427418697012&pe=tnt&tnta=291105:0:0|1
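The same hit can be assembled programmatically; the sketch below builds both the click-event form (with pe=tnt) and the page-view form (tnta only). The tracking server, rsid, mid, and tnta payload are placeholders taken from the sample above:

```python
from urllib.parse import urlencode

def a4t_hit(tracking_server, rsid, mid, tnta, click_event=False, **analytics_vars):
    """Build a Data Insertion hit carrying Target's tnta payload."""
    params = {"mid": mid, "tnta": tnta, **analytics_vars}
    if click_event:
        params["pe"] = "tnt"   # pe=tnt marks a click event; page views omit pe
    return f"http://{tracking_server}/b/ss/{rsid}/0?{urlencode(params)}"

MID = "5670954657508492467610283427418697012"
click = a4t_hit("metrics.example.com", "myrsid", MID, "291105:0:0|1",
                click_event=True, pageName="testpev", events="event1")
pageview = a4t_hit("metrics.example.com", "myrsid", MID, "291105:0:0|1",
                   pageName="testpev")
print(click)
print(pageview)
```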
6. Submit Expected Traffic Spikes Well In Advance
A lot of businesses have expected traffic spikes each year during seasonal events such as Thanksgiving, and it's quite common for them to schedule traffic spikes for their Adobe Analytics report suites. However, I've seen a lot of cases where these spikes are scheduled quite late, which leads to delayed processing of their data.
To avoid that, schedule your expected traffic spikes well in advance. As an example, expected spikes above 25,000,000 server calls should be submitted one or more months in advance, which gives Adobe enough time to allocate any necessary hardware. Here's a summary of how much lead time is needed by traffic volume.
So that's it! Hopefully you found some of these uncommon scenarios pertaining to Adobe Analytics helpful.