Closed Bug 1220432 Opened 5 years ago Closed 4 years ago

Autophone - links to perfherder need to have the correct signatures


(Testing :: Autophone, defect)




(Reporter: bc, Assigned: jmaher)




(1 file)

You can see the two graphs here: [mozilla-inbound,f955baa4c127f70ca253fa72e29a3994966f6637,1]&series=[mozilla-inbound,302e02aaa43603f1d4eb3b38b49938c50d8a9d75,1]

It is all in the signatures. To get this we will need to make some Treeherder API calls. I believe we already have treeherder-client in use, so this should be fairly straightforward. The ideal spot to do this would be in the

:bc, let me know if this use of treeherder-client sounds good to you.
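
To make the lookup idea concrete, here is a minimal sketch of querying Treeherder for performance signatures and filtering the result. The endpoint path and the `framework` query parameter are assumptions based on the public Treeherder REST API, and `find_signature` plus its property names (`suite`, `machine_platform`) are hypothetical helpers, not code from Autophone:

```python
import json
import urllib.parse
import urllib.request

def fetch_signatures(server, repo, framework=1):
    """Fetch the performance signature map for a repository.

    The endpoint path and 'framework' parameter are assumptions based
    on the public Treeherder REST API, not pinned down in this bug.
    """
    query = urllib.parse.urlencode({"framework": framework})
    url = "%s/api/project/%s/performance/signatures/?%s" % (server, repo, query)
    with urllib.request.urlopen(url, timeout=30) as resp:
        # Expected shape: {signature_hash: {"suite": ..., "machine_platform": ...}}
        return json.load(resp)

def find_signature(signatures, suite, platform):
    """Pick the signature hash matching a suite/platform pair, or None."""
    for sig_hash, props in signatures.items():
        if (props.get("suite") == suite and
                props.get("machine_platform") == platform):
            return sig_hash
    return None
```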
Flags: needinfo?(bob)
Summary: Autophone - Talos Perfherder graph is empty → Autophone - links to perfherder need to have the correct signatures
I am all for it if we can look this up and not hard-code it. Which version of the client do we need and where are the features and documentation?
Flags: needinfo?(bob)
We shouldn't have to look it up each time we want to submit a result to Treeherder, though. I would expect it to live better in PerfTest.
We need to look it up for every branch/platform/test combination; that is where the signature comes from. I honestly don't know where this is documented; I usually just hack around in other code to find the signature.
Would a reasonable solution be to hard-code the signatures for fx-team/inbound/central/aurora/beta/try, and look up the signature only if we are not on one of those branches?
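
A hybrid like that could be sketched as a small cache with a lookup fallback. The table key, the platform/test names, and the `lookup` callback are hypothetical; the signature hash is the first one quoted in the graph link above:

```python
# Hypothetical sketch of the hybrid approach: hard-coded signatures for the
# common branches, with a live lookup fallback for anything else.
KNOWN_SIGNATURES = {
    # (repo, platform, test) -> signature hash; entries below are examples
    ("mozilla-inbound", "android-4-0-armv7-api15", "tp4m"):
        "f955baa4c127f70ca253fa72e29a3994966f6637",
}

def get_signature(repo, platform, test, lookup=None):
    """Return a cached signature, falling back to a live lookup if given."""
    key = (repo, platform, test)
    sig = KNOWN_SIGNATURES.get(key)
    if sig is None and lookup is not None:
        sig = lookup(repo, platform, test)
        if sig:
            KNOWN_SIGNATURES[key] = sig  # cache for the rest of the run
    return sig
```

The fallback only fires for branches missing from the table, so the common case never touches the network.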
Flags: needinfo?(bob)
I would rank the approaches for the tests which use Perfherder (TalosTest, RoboTest and perhaps any descendant of PerfTest) as:

1. Look up the required signatures when the test object is first initialized in Autophone in

2. Hardcode all of the signatures in a py script that is imported.

3. Mix the two approaches.

If the signatures are constant and not liable to change, and if the set of possible repositories is limited, we could just create the script and maintain a list of all of the supported repositories there. The cost is the maintenance required: a one-time cost, with the risk of failing to keep up with changes.

If we aren't certain the signatures won't change, or if we expect the set of repositories to change relatively frequently, then I think looking them up each time we start is the better approach. It costs a lookup each time we start an Autophone instance, but we no longer have to maintain the list manually, nor risk a repository's signature silently changing over time.

Mixing the two approaches seems like the worst choice: we pay the potential costs of both, with slightly more complicated code.

Flags: needinfo?(bob)
Great advice, :bc! The signatures are for the most part static. If we change the number of pages or other attributes of what we are measuring, the signatures can change, but I don't see that happening often. I am leaning towards the static list of signatures, as a dynamic query of Treeherder is one more thing that can break (if it is down, if the API changes, if we get a timeout, etc.).

If we don't like that, I do like the idea of fetching the signature prior to starting the test.

Let me work on implementing the static signatures.
As discussed on IRC, we have this data automatically in the performance panel on Treeherder. This means we do not need anything hardcoded, nor any API call or lookup.
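
For reference, the Perfherder links this bug is about can be assembled from (repo, signature) pairs in the format quoted at the top. This helper is a sketch that mirrors that format; the assumption that the trailing `1` marks the series as visible is inferred from the example link, not confirmed here:

```python
def perfherder_graph_url(series, server="https://treeherder.mozilla.org"):
    """Build a Perfherder graphs link from (repo, signature_hash) pairs.

    Mirrors the series=[repo,signature,1] fragments quoted earlier in
    this bug; the trailing 1 is assumed to mark the series as visible.
    """
    fragments = "&".join(
        "series=[%s,%s,1]" % (repo, sig) for repo, sig in series)
    return "%s/perf.html#/graphs?%s" % (server, fragments)
```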
Attachment #8729620 - Flags: review?(bob)
Closed: 4 years ago
Resolution: --- → FIXED
deployed 2016-03-15 06:29