Closed Bug 1273983 Opened 8 years ago Closed 8 years ago

Allow enabling/disabling a performance framework

Categories

(Tree Management :: Perfherder, defect)


Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: wlach, Assigned: ShrutiJ)

Details

Attachments

(2 files, 4 obsolete files)

To enable testing things like bug 1272176 on stage only, I'd like to add an "enabled" property to the performance framework model (defaulting to False) and let the user override the setting in the admin panel. Unless enabled is set to true, any performance data tagged with that framework will *not* be ingested. This will let us test out new performance data ingestion on stage without jeopardizing production stability.

I think we can do this with the following changes:

1. Add an enabled flag to the PerformanceFramework model in treeherder/perf/models.py (should be a boolean, defaulting to False: check docs)
2. Modify treeherder/model/fixtures/performance_framework.json to set the enabled flag to True for all existing frameworks (I'm not sure if loaddata will update existing instances of a model, so we'll have to check that behavior).
3. Modify the admin settings (treeherder/webapp/admin.py) to enable modification of the PerformanceFramework entries.
4. Modify treeherder/etl/perf.py to skip performance blobs if the framework is not enabled.
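
A minimal sketch of step 1, assuming the existing model roughly matches treeherder/perf/models.py at the time. The fields other than the new `enabled` flag are illustrative, and the fragment is not runnable on its own (it needs the Treeherder Django app context):

```python
# Sketch only: proposed change to treeherder/perf/models.py.
# The `name` field here is a guess at the existing schema; the
# new `enabled` flag is the substance of this bug.
from django.db import models

class PerformanceFramework(models.Model):
    name = models.SlugField(max_length=255, unique=True)
    # New flag: frameworks start out disabled, so data tagged with a
    # brand-new framework is ignored until an admin turns it on.
    enabled = models.BooleanField(default=False)
```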

Some modifications to this plan, after I learned more about how loaddata works:

(In reply to William Lachance (:wlach) from comment #0)
> To enable testing things like bug 1272176 on stage only, I'd like to add an
> "enabled" property to the performance framework model (defaulting to False)
> and let the user override the setting in the admin panel. Unless enabled is
> set to true, any performance data tagged with that framework will *not* be
> ingested. This will let us test out new performance data ingestion on stage
> without jeopardizing production stability.
> 
> I think we can do this with the following changes:
> 
> 1. Add an enabled flag to the PerformanceFramework model in
> treeherder/perf/models.py (should be a boolean, default to False: check docs)
> 2. Modifying treeherder/model/fixtures/performance_framework.json to set the
> enabled flag to True for all existing repositories (I'm not sure if loaddata
> will update existing instances of a model, so we'll have to check that
> behavior).

We will do this, however I realized that this method will mean that anything in the fixture file will be blown away whenever we migrate stuff. I think we need to just move away from a model where this data is stored in a fixture, instead creating the framework on ingestion -- but *only* creating the framework, not enabling it.

To grandfather in existing data though, we'll modify the existing fixtures in performance_framework.json to set "enabled" to true for everything so there is no interruption of service. So e.g. change something like:

```
        "pk": 1,
        "model": "perf.PerformanceFramework",
        "fields": {
            "name": "talos"
        }
```

to:


```
        "pk": 1,
        "model": "perf.PerformanceFramework",
        "fields": {
            "name": "talos",
            "enabled": true
        }
```
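
Since every existing entry needs the same edit, a throwaway script could apply it mechanically. This is a hypothetical helper (not part of the bug's patches) that adds `"enabled": true` to each entry of a fixture shaped like the examples above:

```python
import json

# A fixture fragment shaped like performance_framework.json; in
# practice this would be read from the real file rather than an
# inline string.
fixture = json.loads("""
[
    {"pk": 1,
     "model": "perf.PerformanceFramework",
     "fields": {"name": "talos"}}
]
""")

# Grandfather in every existing framework by enabling it explicitly.
for entry in fixture:
    entry["fields"]["enabled"] = True

print(json.dumps(fixture, indent=4))
```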

We will not add any new performance frameworks to the list in the future; instead, we will rely on the admins to enable them on a case-by-case basis.

> 3. Modifying the admin settings (treeherder/webapp/admin.py) to enable
> modification of the PerformanceFramework entries.
> 4. Modifying treeherder/etl/perf.py to skip performance blobs if the
> framework is not enabled.

So as hinted at above, two changes here:

- *create* the framework if it doesn't already exist (but use the default of enabled=False)
- write a log message stating that we're not ingesting data from non-enabled performance frameworks
Shruti, do you want to take this on? It's a lot to take in and learn about, so feel free to ask questions. It's worth learning about the following concepts:

* django models
* django migrations
* django fixtures
* django admin

... as well as various aspects of the perfherder data model. Feel free to ask questions if anything is unclear!
Assignee: nobody → shrutijasoria1996
I'll start learning about perfherder and django. Can you suggest a good place to learn about the Perfherder data model?
Flags: needinfo?(wlachance)
I would suggest working through the django tutorial (https://docs.djangoproject.com/en/1.9/intro/tutorial01/), then taking a look at the following resources:

* https://github.com/mozilla/treeherder/blob/master/treeherder/perf/models.py
* https://github.com/mozilla/treeherder/blob/master/treeherder/webapp/api/performance_data.py

At heart, Perfherder just stores performance counters of various kinds in a database and then serves them up via a web API. It's a pretty standard CRUD (create, read, update, delete) web application.
Flags: needinfo?(wlachance)
I have familiarized myself with the django concepts you mentioned in comment 3. I have also gone through the above files. What should I do next?
Flags: needinfo?(wlachance)
(In reply to Shruti Jasoria [:ShrutiJ] from comment #5)
> I have familiarized myself with the django concepts you mentioned in comment
> 3. I have also gone through the above files. What should I do next?

Sorry for the late reply, I was off for the past couple days.

I would recommend starting by adding the "enabled" field to the PerformanceFramework model (defaulting to False) and modifying the existing fixtures (treeherder/perf/performance_framework.json) to set enabled to "True". Then create a migration for the change (`./manage.py makemigrations -n "add_performanceframework_enabled"`) and start working on adding code to only ingest performance data (treeherder/etl/perf.py) if the framework is enabled. At that point you could write a unit test to verify your code (a very similar test is in tests/etl/test_perf_data_adapters.py around line 271, search for "test_no_performance_framework")
Flags: needinfo?(wlachance)
Attached file Required changes minus the tests (obsolete) —
Here are the changes which I have made. Do I need to write tests in the following file or write a new one?

https://github.com/mozilla/treeherder/blob/master/tests/etl/test_perf_data_adapters.py
Attachment #8765732 - Flags: feedback?(wlachance)
Comment on attachment 8765732 [details] [review]
Required changes minus the tests

There are some problems in the implementation, but this is a good start. Go ahead and write some tests in the existing file, when you're ready for that (focus on making the existing tests pass first).
Attachment #8765732 - Flags: feedback?(wlachance) → feedback+
The changes which I have made have passed the tests locally:
http://pastebin.com/UM9fGt42

Can I go ahead with the tests now?
Flags: needinfo?(wlachance)
Sure thing! No need to ask me for permission if you think you're on the right track (it looks like you are). :)
Flags: needinfo?(wlachance)
Attached file Changes along with the test (obsolete) —
Here is the test along with the changes which I have written.

After correcting a nit in my code, the tests started to fail again. As I understand it, _load_perf_artifact is prevented from creating the required objects because the 'enabled' field defaults to False.
Attachment #8765732 - Attachment is obsolete: true
Attachment #8766653 - Flags: feedback?(wlachance)
Comment on attachment 8766653 [details] [review]
Changes along with the test

Many unit tests use the test_performance_framework fixture, which would need to be changed to set "enabled" to True.

https://github.com/mozilla/treeherder/blob/52607c2a57c9f9780b2b7b96c39124db700357d2/tests/conftest.py#L488

Some tests may create their own performance framework, in which case they would need to be modified as well. Try "git grep PerformanceFramework" in the tests directory. :)
Attachment #8766653 - Flags: feedback?(wlachance)
I have made changes to all the tests which are using PerformanceFramework model. The changes have passed all the existing tests. Please take a look and let me know if any changes have to be made.

What should I do next?
Attachment #8766653 - Attachment is obsolete: true
Attachment #8766815 - Flags: feedback?(wlachance)
Comment on attachment 8766815 [details] [review]
Required changes along with modification in tests using PerformanceFramework model

This actually looks pretty close to me. Left a few comments. Also the tests on travis are failing, probably because you forgot to check in your migration.
Attachment #8766815 - Flags: feedback?(wlachance) → feedback+
I have made the changes which you had pointed out on Github.

The method `def __getitem__(self, enabled)` in treeherder/perf/models.py

https://github.com/SJasoria/treeherder/blob/22be6d1f1ec45df7ceca747e26d8e363e0410def/treeherder/perf/models.py#L26

needs to be there to allow dict-style access to 'enabled'. If it's not there, the following error pops up while running the tests:

```
File "/home/vagrant/treeherder/treeherder/etl/perf.py", line 98, in load_perf_artifact
    if framework['enabled'] is False:
TypeError: 'PerformanceFramework' object has no attribute '__getitem__'
```
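
The error comes from using dict-style subscripting on a model instance. A minimal stand-in class (hypothetical, not the real Django model) shows both the `__getitem__` workaround and the simpler alternative of plain attribute access, which would make the helper unnecessary:

```python
class PerformanceFramework:
    """Minimal stand-in for the Django model under discussion."""

    def __init__(self, name, enabled=False):
        self.name = name
        self.enabled = enabled

    # The workaround described above: route framework['enabled']
    # through attribute lookup so dict-style access keeps working.
    def __getitem__(self, key):
        return getattr(self, key)

fw = PerformanceFramework("talos")
print(fw["enabled"])   # dict-style access, via __getitem__
print(fw.enabled)      # attribute access needs no helper at all
```

Writing `framework.enabled` in etl/perf.py instead of `framework['enabled']` would avoid defining `__getitem__` entirely.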
Attachment #8767040 - Flags: review?(wlachance)
Attachment #8766815 - Attachment is obsolete: true
Comment on attachment 8767040 [details] [review]
Final changes with migrations checked in

Some comments to be addressed, see the PR. Please ask for review again when ready. :)
Attachment #8767040 - Flags: review?(wlachance)
I have made the changes which you had asked for in the comment. The changes passed the tests locally, and PerformanceFramework was accessible after I logged in via Persona. But the Travis CI build is giving an error, and I am not able to understand why that is happening.
Attachment #8767040 - Attachment is obsolete: true
Attachment #8767685 - Flags: review?(wlachance)
Verified that this is working well and merged. Thank you ShrutiJ!
Status: NEW → RESOLVED
Closed: 8 years ago
Resolution: --- → FIXED
Comment on attachment 8767685 [details] [review]
Changes along with PerformanceFramework added to admin.py

FYI there is no need to make a new attachment when updating an existing PR: just re-set the review flag on the old one and I'll see it. :)
Attachment #8767685 - Flags: review?(wlachance)