See bug 644687 for more background; we should talk about test approaches/strategies/automation tools that can help ensure that Webtrends tags (<script> / <noscript>) are included on at least Mozilla.com properties. While it wouldn't be the be-all and end-all, if there's a lightweight, reliable, modular/flexible solution (to test trunk/prod), we should take a look at it. (I'm thinking Selenium, obviously, but wanted to get the discussion going.)
This should be fairly trivial. We can add an item to the base page that can be called either implicitly or explicitly, and then have a test added to the template; that test will be disabled straight away.
Not sure if bug 570583 is a good model for this, but we should revisit it and see what we can do. Over in https://intranet.mozilla.org/Webanalytics#Tracking_a_New_Page, it says, "Use the existing tracking tag on that existing domain." Laura, a couple of questions: * What are the existing tracking tags for each of the sites we track? * What does the generic tracking code snippet look like (i.e. what are the common elements for which we could automate checks and provide good coverage)? Thanks.
All: I just got word yesterday from stevend that this bug existed. I just happen to be doing the very same thing, and the scripts are about done. I'm doing a final test run now, and it should be done within the next 24 hours.

The script had multiple purposes, but it can be pared down to specific functions:
* Scan all 383 domains owned by Mozilla via an input text file.
* Determine the HTTP status code of each of those domains.
* Record whether the website is OK, Error, or redirects to another domain.
* On websites that are "OK", look on the homepage for a substring that only exists if Webtrends is installed, and record whether analytics is installed.
* On websites that have Webtrends installed on the homepage, spider every publicly accessible page and determine whether Webtrends is installed across the website. Record the percentage of pages on the website with Webtrends installed (analytics coverage).
* Output all of this data as wiki markup.

The previous version of this script was used to create this page on the Intranet: https://intranet.mozilla.org/Domain_Names

The current version of the script, which is not on GitHub at this moment, is more heavily threaded. Historically, I was looping through every domain sequentially, and it took forever to run. It was not bandwidth-limited; it was working as fast as the unix commands could read and write. I switched the script to fork a separate process for every domain to be checked, so it could mirror multiple websites concurrently. I ran it overnight last night, and it was almost complete by this morning. I killed all the child processes this morning after I discovered a bug that needed to be fixed. It is rerunning now, will probably run all day and night, and should hopefully be finished in the morning. After I have verified the results and they are up on the wiki, I will post a note to this bug. If this looks like it will help other people, we can adapt the script to additional needs.
It is just a few bash scripts that used basic unix commands. Thanks, Chris
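(Editor's note: for illustration, the per-domain checks described above could be sketched roughly in Python as below. The `"webtrends"` marker substring, the status buckets, and the function names are assumptions for the sketch, not the actual logic in Chris's bash scripts.)

```python
def classify_status(code):
    """Map an HTTP status code to the OK / Redirect / Error buckets
    used in the wiki report (bucket names assumed)."""
    if 200 <= code < 300:
        return "OK"
    if 300 <= code < 400:
        return "Redirect"
    return "Error"

def has_webtrends(html, marker="webtrends"):
    """Return True if the (assumed) Webtrends marker substring
    appears anywhere in a page's HTML, case-insensitively."""
    return marker.lower() in html.lower()

def coverage(pages, marker="webtrends"):
    """Percentage of spidered pages whose HTML contains the marker,
    i.e. the 'analytics coverage' figure from the wiki output."""
    if not pages:
        return 0.0
    tagged = sum(1 for html in pages if has_webtrends(html, marker))
    return 100.0 * tagged / len(pages)
```

Fetching each homepage (e.g. with `urllib.request`) and spidering are omitted here; the sketch only shows the classify/scan/aggregate steps.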
Hey Chris - Thanks! We already have https://github.com/mozilla/sumo-tests/blob/master/test_webtrends_search_tracking.py, which is SUMO-specific; one of the goals of writing a one-size-fits-all Webtrends test in Python is that we can have it in our CI (Jenkins), alongside a suite of other tests we'd like Engagement/other projects to pass (and keep passing) before/while we're QA'ing them. So, these two approaches actually complement each other, I think. Would still be good to talk to you about this in person, so will do so.
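(Editor's note: a rough, hypothetical sketch of the core check such a one-size-fits-all test might make, based only on this bug's description of the Webtrends <script>/<noscript> tags; the regexes and function name are assumptions, not code from sumo-tests.)

```python
import re

def webtrends_tags_present(html):
    """Scan a page's HTML and return (script_found, noscript_found),
    where each flag means a <script> or <noscript> element that
    mentions Webtrends was found. Pattern matching is an assumption
    about what the real tags look like."""
    script_found = bool(re.search(
        # 'webtrends' either in the tag's attributes (e.g. a src URL)
        # or in the inline script body before </script>
        r'<script\b[^>]*(?:webtrends[^>]*>|>(?:(?!</script>).)*webtrends)',
        html, re.I | re.S))
    noscript_found = bool(re.search(
        r'<noscript\b[^>]*>(?:(?!</noscript>).)*webtrends',
        html, re.I | re.S))
    return script_found, noscript_found
```

In a CI suite this would sit behind a parametrized test that fetches each property's homepage and asserts both flags are True.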
Well, the script finished, and the wiki output looks good. I've uploaded the latest version to the wiki page below. The columns are sortable, but if you're going to sort by analytics coverage, you may have to click it twice to get a correct numeric sort. https://intranet.mozilla.org/Domain_Names Overall statistics are at the bottom of the list. Stephen: Yeah, let's get a meeting set up and chat about next steps. I agree, they could complement each other. I only wrote the scripts to help with the documentation, because who wants to manually visit 383 domains. :) There are a few sites that were skipped in the scan because they redirected, but I'm looking into it. Chris
Um, sorry for bugspam...