Document stub installer pings on dtmo
Categories
(Data Science :: Documentation, task)
Tracking
(Not tracked)
People
(Reporter: tdsmith, Assigned: ethompson)
References
()
Details
Stub installer pings presently aren't documented on dtmo.
The canonical documentation is in the tree: https://searchfox.org/mozilla-central/source/browser/installer/windows/docs/StubPing.rst
They are formed and sent from NSIS code (!) in the stub installer, in the SendPing
subroutine: https://searchfox.org/mozilla-central/source/browser/installer/windows/nsis/stub.nsi
They are processed into Redshift by https://github.com/whd/dsmo_load (I think).
The Redshift tables are accessible from the DSMO-RS data source in STMO.
In Redshift, it looks like a table is created for each calendar day; download_stats and download_stats_year are views that union all of the per-day tables together (or, for the latter, a year's worth of them), which makes e.g. SELECT * LIMIT 10 operations on them quite slow.
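For anyone poking at these from STMO, a rough sketch of how to keep such queries cheaper follows. The per-day table name and the timestamp column below are assumptions for illustration only, not verified against the actual DSMO-RS schema, so check the table list in STMO first:

  -- Query a single per-day table directly instead of the union view
  -- (download_stats_20190201 is a hypothetical name):
  SELECT *
  FROM download_stats_20190201
  LIMIT 10;

  -- Or constrain the view to a narrow window; the planner may still touch
  -- every per-day table, but at least the result set stays small
  -- (submission_date is an assumed column name):
  SELECT *
  FROM download_stats
  WHERE submission_date >= '2019-02-01'
    AND submission_date < '2019-02-02'
  LIMIT 10;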
Reporter
Comment 1 • 6 years ago
Confirmed via Bug 1520794 that whd/dsmo_load is the repo that's used in production.
Assignee
Comment 2 • 6 years ago
I've started some notes in this doc (which for now mostly just amalgamates your comments above). I was thinking this could live under "Dataset Specific" on DTMO. Thoughts? Comments?
Reporter
Comment 3 • 6 years ago
I added comments with a couple of references to answer some of the questions you had; it's probably worth linking out to the Funnelcake wiki page. Otherwise, you know as much as I do :)
I think I would put this under 6.5, Dataset Reference - Other Datasets.
I think this would be a valuable contribution to dtmo as-is. Opening a DTMO PR and flagging robotblake and/or whd for review, and for assistance fleshing out anything you think is missing, is probably a good next step.