Bug 789519 (Closed) — Opened 13 years ago, Closed 13 years ago

when running xperf tests, do not upload regular test results to graph server

Categories

(Testing :: Talos, defect)

Hardware: All
OS: Windows 7
Type: defect
Priority: Not set
Severity: normal

Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: jmaher, Assigned: jmaher)

Details

Attachments

(1 file)

Now that we are running tp5 under xperf, we need to avoid posting the results from the test itself to the graph server; otherwise we would upload conflicting data for the main test. Some options: 1) do not upload the tp5n_paint and tp5n_shutdown results; 2) rename the results to tp5n_paint_xperf (which would require graph server additions).
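As a rough illustration of option 1 (hypothetical names only; the actual change is in the attached patch), filtering the upload might look something like this:

# Sketch of option 1, with hypothetical names (filter_results_for_upload is
# not a real Talos function): drop the regular tp5n results from the graph
# server upload whenever the run collected xperf counters.
def filter_results_for_upload(test_results, xperf_enabled):
    """test_results: list of (name, value) pairs, e.g. [("tp5n_paint", 312.4)]."""
    if not xperf_enabled:
        return test_results
    # Posting tp5n_paint / tp5n_shutdown alongside the xperf counters would
    # create conflicting datapoints for the same test on the graph server.
    suppressed = {"tp5n_paint", "tp5n_shutdown"}
    return [(name, value) for name, value in test_results if name not in suppressed]

# Example: an xperf run keeps nothing, a regular run keeps everything.
results = [("tp5n_paint", 312.4), ("tp5n_shutdown", 901.0)]
print(filter_results_for_upload(results, xperf_enabled=True))   # []
print(filter_results_for_upload(results, xperf_enabled=False))  # both pairs kept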
ack, this is hacky but it seems to work. Did a few try runs and they are going well.
Assignee: nobody → jmaher
Status: NEW → ASSIGNED
Attachment #659789 - Flags: review?(jhammel)
Comment on attachment 659789 [details] [diff] [review]
do not post results (or shutdown) if we have xperf counters (1.0)

+ if activeTests == "tp5n" and self.config.get('xperf_path', ''):
+     global_overrides['shutdown'] = False

So at the very best, this is awful and should be documented. Why is this change here? Are we passing the --shutdown flag and wanting to override it? Also, shouldn't we use xperf_counters to tell whether xperf is running? Why activeTests == 'tp5n'? Do we really need that extra level of magic?

+ if not test.using_xperf:

Etc. We should probably at least add a comment explaining why we're doing this (or not doing this, as the case may be).
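For what it's worth, a minimal sketch of the kind of comment being asked for around the quoted hunk (only the condition and the assignment mirror the patch; the wrapper function and its argument names are assumptions):

def apply_xperf_overrides(active_tests, config, global_overrides):
    # Only the 'if' condition and the override below come from the quoted
    # diff; this function and its signature are illustrative.
    if active_tests == "tp5n" and config.get('xperf_path', ''):
        # tp5n runs driven under xperf report their own paint/shutdown
        # counters, so suppress the regular shutdown upload here to avoid
        # posting conflicting datapoints to the graph server (bug 789519).
        global_overrides['shutdown'] = False
    return global_overrides

# Example: a tp5n run configured with an xperf path turns the override off.
print(apply_xperf_overrides("tp5n", {"xperf_path": "xperf.exe"}, {"shutdown": True}))
# -> {'shutdown': False}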
Comment on attachment 659789 [details] [diff] [review]
do not post results (or shutdown) if we have xperf counters (1.0)

So this is pretty hacky, but the issues were discussed over IRC. r+, given that all of this should be heavily commented as to why it's being done in this hacky way.
Attachment #659789 - Flags: review?(jhammel) → review+
Status: ASSIGNED → RESOLVED
Closed: 13 years ago
Resolution: --- → FIXED
