Closed Bug 1229507 Opened 9 years ago Closed 9 years ago

Make UT Heartbeat Packets Easily Accessible in Spark

Categories

(Cloud Services Graveyard :: Metrics: Pipeline, defect, P3)

Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: glind, Assigned: mreid)

References

Details

So, Heartbeat (HB) in Unified Telemetry (UT) is coming; see bug 1193535. This is the "rare packet, but we need them all" use case. Could this be done by Orlando? Please!
Filed https://github.com/mozilla-services/puppet-config/pull/1678 to add the "heartbeat" document type (so that it doesn't get bucketed into "OTHER").
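The bucketing mentioned above can be sketched as follows. This is a minimal, hypothetical illustration of the routing behavior (the real logic lives in the pipeline's puppet/Heka configuration, and the function and set names here are made up); the point is simply that document types not on the known list fall into an "OTHER" bucket.

```python
# Hypothetical sketch of pipeline doc-type bucketing.
# Assumption: pings with an unrecognized docType are routed to "OTHER";
# adding "heartbeat" to the known set gives it its own bucket.
KNOWN_DOC_TYPES = {"main", "crash", "saved-session", "heartbeat"}

def bucket_for(doc_type):
    """Return the storage bucket for a given ping document type."""
    dt = doc_type.lower()
    return dt if dt in KNOWN_DOC_TYPES else "OTHER"

print(bucket_for("heartbeat"))   # -> heartbeat
print(bucket_for("mystery-ping"))  # -> OTHER
```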
Gregg, do these really need to go into Redshift? If so, what do you want done with all the nested environment stuff?
Flags: needinfo?(glind)
Priority: -- → P1
Summary: Divert UT Heartbeat Packets (to Redshift) → Make UT Heartbeat Packets Easily Accessible in Spark
This is still waiting on packets coming into UT. I don't care about Redshift vs. some other solution. Something very similar to the executive summary vars would be awesome (+ crashes). Let's talk this out again soon. (In my head, I have this fantasy "lifespan" data row, and a "day-person" row form, but they are vague. It's the same stuff we talked about with churn and other things.)
Flags: needinfo?(glind)
We will need to size this when we know more about the packets.
Priority: P1 → P3
The "heartbeat" records are now flowing into the pipeline, and are accessible via Spark using the `get_pings` functionality, the same way as "main" pings (by specifying doc_type="heartbeat"). If we need a specific derived dataset, let's file a new bug to specify/implement it.
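For reference, retrieving these pings in an analysis notebook would look roughly like the sketch below. The real call is moztelemetry's `get_pings` on the notebook's SparkContext (shown in the comment); since that requires a cluster, the runnable part here mocks the same doc_type filtering over a list of ping records, and the record shape and helper name are illustrative.

```python
# In a real Telemetry Spark notebook (requires a cluster and moztelemetry):
#   from moztelemetry import get_pings
#   pings = get_pings(sc, doc_type="heartbeat")
#
# Hypothetical local stand-in: filter mocked ping records by docType.
def pings_with_doc_type(pings, doc_type):
    """Return only the pings whose meta docType matches (case-insensitive)."""
    return [p for p in pings
            if p.get("meta", {}).get("docType", "").lower() == doc_type.lower()]

sample = [
    {"meta": {"docType": "main"}, "payload": {}},
    {"meta": {"docType": "heartbeat"}, "payload": {"flowId": "abc"}},
]

heartbeats = pings_with_doc_type(sample, "heartbeat")
print(len(heartbeats))  # -> 1
```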
Status: NEW → RESOLVED
Closed: 9 years ago
Resolution: --- → FIXED
Product: Cloud Services → Cloud Services Graveyard