Measurements

To quantify success, we need to track the following metrics:

- Sentiment measures
  - Full questionnaire
- Heavy user traits
  - Page views
  - Average number of tabs
  - Usage hours/session hours
- Additional measurements (useful for further product/feature development, qualifying results, etc.)
  - Number of searches (Yahoo etc.)
  - Number of searches for tabs in the sidebar
  - Number of mode switches (tabs-on-top mode vs. side-tabs mode)
  - Frequency of mode switches
  - Time spent in each mode
  - Number of tabs in each mode
  - Number of tabs on switch
    - The goal here is to know what tab count causes users to switch from one mode to the other. We need separate measures for the two switching directions.
  - Progress/dropoff rate for onboarding
    - Outlined in more detail in the onboarding spec
  - Click-through rate when the hint is shown (when a user is in tabs-on-top mode, we can show a hint to switch to Tab Center)

Segmentation

- Time spent in side-tabs mode
- Influencers

Hypotheses (formal version)

Primary
- Sentiment measures of A > sentiment measures of B (by 2%+)
- Number of pageviews of C > number of pageviews of D (by 1%+)
- Retention of E >= retention of F (by 1%+)

Secondary
- Total usage hours of A, C > total usage hours of B, D, respectively
- Number of searches in A, C, E >= number of searches in B, D, F, respectively

Note: Any one of these can be considered a success so long as it is not accompanied by a drop in the key metrics for the other success criteria. (E.g., if sentiment of A goes up relative to B, but retention of E gets worse than retention of F, that wouldn't necessarily be considered a success.)

Experiment duration

We believe we can see results within one month of enrollment. We are targeting mid-August 2017 for the launch.
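The combined success rule in the note above can be sketched as a simple check. This is only an illustration, not the actual analysis code: the branch labels A-F and the 2%/1% thresholds come from the hypotheses, while the metric names and the `evaluate` helper are hypothetical.

```python
# Sketch of the combined success rule from the hypotheses section:
# at least one primary criterion must clear its threshold, and no
# key metric may regress. All names and inputs are illustrative.

def relative_lift(treatment, control):
    """Relative change of treatment over control, e.g. 0.02 == +2%."""
    return (treatment - control) / control

def evaluate(metrics):
    # (treatment value, control value, minimum relative lift)
    criteria = {
        "sentiment": (metrics["sentiment_A"], metrics["sentiment_B"], 0.02),
        "pageviews": (metrics["pageviews_C"], metrics["pageviews_D"], 0.01),
        "retention": (metrics["retention_E"], metrics["retention_F"], 0.01),
    }
    lifts = {name: relative_lift(t, c) for name, (t, c, _) in criteria.items()}
    wins = [name for name, (_, _, thr) in criteria.items() if lifts[name] >= thr]
    drops = [name for name, lift in lifts.items() if lift < 0]
    # Per the note: at least one win, and no drop in any other key metric.
    return bool(wins) and not drops
```

For example, a +3% sentiment lift with flat pageviews and retention would count as a success, while the same lift paired with a retention drop would not.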
The telemetry for the experiment can be found here: https://github.com/bwinton/TabCenter/blob/master/docs/metrics.md
Saptarshi, can you validate that it is permissible to collect the telemetry data linked above?
Created attachment 8910485 [details] tab-center.xpi
Created attachment 8910748 [details] tab-center.xpi
We plan on shipping this in Release 56. Can we get the add-on signed please?
Since this is going to release, we need sign-off on feature testing from a QE team. If y'all already have it, can you link to it from this bug?
Hi, we have run our set of manual tests on Firefox 56 using the provided build and uncovered a couple of issues. All of them were addressed and are verified as fixed (for more details about the testing we have done, please see the following link: https://testrail.stage.mozaws.net/index.php?/plans/view/6374). We found another issue today (https://github.com/bwinton/TabCenter/issues/1127), but since it isn't major, from a manual QA perspective we consider that we are good to go.
Created attachment 8913773 [details] tab-center-1.37.0-signed.xpi I talked with Erica on Slack, and she clarified that the XPI uploaded in comment 5 includes the fixes mentioned in comment 8. Given that, I've signed it and uploaded it to Shield's admin interface. I've also attached it here.
Sounds good, please go ahead with the rollout for release 56.
Deployed a CEP filter to track DAU in this study: https://pipeline-cep.prod.mozaws.net/dashboard_output/analysis.jgaunt.tabcenter_estimates.tabcentertest1_count.json
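A DAU count like the one the CEP filter above produces boils down to counting distinct clients per day. A minimal offline sketch over ping records (the `date` and `client_id` field names are assumptions for illustration, not the actual ping schema):

```python
# Illustrative DAU computation: count distinct client IDs per day.
# Field names ("date", "client_id") are hypothetical, not the real schema.
from collections import defaultdict

def daily_active_users(pings):
    clients_by_day = defaultdict(set)
    for ping in pings:
        clients_by_day[ping["date"]].add(ping["client_id"])
    # Distinct clients seen on each day.
    return {day: len(clients) for day, clients in clients_by_day.items()}
```

Using a set per day means repeated pings from the same client within a day are counted once.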