Closed
Bug 1130090
Opened 8 years ago
Closed 8 years ago
Set up new builds on larch
Categories
(Release Engineering :: General, defect)
Tracking
(Not tracked)
RESOLVED
FIXED
People
(Reporter: catlee, Assigned: bhearsum)
References
Details
Attachments
(9 files, 4 obsolete files)
23.91 KB, patch | fabrice: review+ | bhearsum: checked-in+
4.76 KB, patch | jlund: review+ | bhearsum: checked-in+
3.04 KB, patch | jlund: review+ | bhearsum: checked-in+
10.21 KB, patch | jlund: review+ | bhearsum: checked-in+
4.77 KB, patch | jlund: review+ | bhearsum: checked-in+
4.87 KB, patch | jlund: review+ | bhearsum: checked-in+
1.73 KB, patch | jlund: review+ | bhearsum: checked-in+
537 bytes, patch | catlee: review+ | bhearsum: checked-in+
2.35 KB, patch | jlund: review+ | bhearsum: checked-in+
We need to create a few new builds to support the graphene project on the larch branch in addition to the regular builds. Supported platforms will be linux64, osx64 and win64 (ideally). I think we'll want debug builds as well. We'll need to do nightly builds w/ updates. These are similar to b2g desktop builds w/ different mozconfigs. Fabrice will be landing those on larch.
Assignee
Comment 1•8 years ago
Fabrice, can you let me know when you've got mozconfigs landed, and point me at them?
Assignee: nobody → bhearsum
Flags: needinfo?(fabrice)
Comment 2•8 years ago
Hi Ben, I just landed them! Here they are: https://hg.mozilla.org/projects/larch/file/d8f03881cda8/b2g/config/mozconfigs

I hacked the win64 ones without any testing, so I would not be surprised if they are broken.
Flags: needinfo?(fabrice)
Comment 3•8 years ago
And the config files we want are actually at https://hg.mozilla.org/projects/larch/file/d8f03881cda8/b2g/graphene/config/mozconfigs
Assignee
Comment 4•8 years ago
(In reply to Fabrice Desré [:fabrice] from comment #3)
> And the config files we want are actually at
> https://hg.mozilla.org/projects/larch/file/d8f03881cda8/b2g/graphene/config/mozconfigs

Fabrice and I talked briefly on IRC - we're going to s/gecko/graphene/ in these names to avoid confusion.
Assignee
Comment 5•8 years ago
Doing updates here could end up being a lot of work. We've never done updates for b2g desktop builds, so that style of build w/ updates is completely untested. B2G desktop builds still use MercurialBuildFactory, so these will probably be more like Firefox updates (pre mozharness) than b2g device updates.
Assignee
Comment 6•8 years ago
Another thing to be careful of is colliding with existing files on FTP.

"Regular" b2g desktop builds go to files such as:
http://ftp.mozilla.org/pub/mozilla.org/b2g/nightly/2015-02-17-01-02-32-mozilla-central/b2g-38.0a1.multi.linux-x86_64.tar.bz2

"localizer" b2g desktop builds set MOZ_PKG_SPECIAL=localizer and go to files like:
http://ftp.mozilla.org/pub/mozilla.org/b2g/nightly/2015-02-17-01-02-32-mozilla-central/b2g-38.0a1.multi.linux-x86_64-localizer.tar.bz2

Mulet builds got lucky and use en-US instead of multi as their locale (otherwise they would've collided):
http://ftp.mozilla.org/pub/mozilla.org/b2g/nightly/2015-02-17-01-02-32-mozilla-central/firefox-38.0a1.en-US.linux-x86_64.tar.bz2

We'll probably need to set MOZ_PKG_SPECIAL here.
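To make the collision concrete, here is a toy sketch of how these package filenames are composed and how MOZ_PKG_SPECIAL disambiguates them. The real naming logic lives in the packaging makefiles (package-name.mk); the function and format string below are simplifications of it:

# Toy illustration only; the real logic lives in package-name.mk.
def package_name(product, version, locale, arch, pkg_special=None):
    suffix = "-%s" % pkg_special if pkg_special else ""
    return "%s-%s.%s.%s%s.tar.bz2" % (product, version, locale, arch, suffix)

# Regular and "localizer" b2g desktop builds land in the same directory:
assert package_name("b2g", "38.0a1", "multi", "linux-x86_64") == \
    "b2g-38.0a1.multi.linux-x86_64.tar.bz2"
assert package_name("b2g", "38.0a1", "multi", "linux-x86_64", "localizer") == \
    "b2g-38.0a1.multi.linux-x86_64-localizer.tar.bz2"
# Without MOZ_PKG_SPECIAL (or a distinct product or locale), a second build
# type producing the first filename would silently overwrite it.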
Assignee
Comment 7•8 years ago
I noticed that all of these are under "b2g/graphene", so I just named them "linux32", et al. I also updated them all in a few ways:
1) Stop referencing anything in b2g/config/mozconfigs
2) Disable update packaging and other update stuff for debug builds (we don't do updates for debug builds)
3) Remove a bunch of unnecessary win64 mozconfigs
4) Make the win64 mozconfigs as alike to the win32 ones as possible (they were massively different before, and I don't think there was any reason for them to be)

These are untested, so we won't know for certain that they work until builds start going, but it's a good starting point.
Attachment #8566148 - Flags: review?(fabrice)
Assignee
Comment 8•8 years ago
A couple of random questions:
* What are we calling this thing in terms of application name? The fact that it's a different "--enable-application" than other b2g stuff suggests to me that it should be called "Graphene" or something similar. This influences a lot of things, such as: upload location (http://ftp.mozilla.org/pub/mozilla.org/b2g/ vs. http://ftp.mozilla.org/pub/mozilla.org/graphene/), name in the update URL (https://aus4.mozilla.org/update/3/B2G/.... vs. https://aus4.mozilla.org/update/3/Graphene/....), builder names, etc. A lot of these things are very difficult to change later, so I'd like to get them right up front if possible.
* Are we doing single locale or multilocale builds? My assumption right now is single locale, en-US only.
** If we're doing multilocale, where is the locales file? (Eg, we use http://mxr.mozilla.org/mozilla-central/source/b2g/locales/all-locales for b2g desktop builds. We'll almost certainly want one under b2g/graphene somewhere if we're doing multilocale.)

I might have a follow-up or two once these are answered and I get deeper into this.
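For illustration, here is a rough sketch of where the application name would land in each of those places. This is not the real URL-building code; the AUS v3 URL contains more fields (buildid, build target, locale, channel, OS version), elided as "..." below just like in the comment above:

# Rough sketch only; the trailing fields of the real AUS v3 URL are elided.
def update_url_prefix(app_name, version):
    return "https://aus4.mozilla.org/update/3/%s/%s/..." % (app_name, version)

def upload_dir(app_name):
    return "http://ftp.mozilla.org/pub/mozilla.org/%s/" % app_name.lower()

print(update_url_prefix("Graphene", "39.0a1"))  # .../update/3/Graphene/39.0a1/...
print(upload_dir("Graphene"))                   # .../pub/mozilla.org/graphene/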
Flags: needinfo?(fabrice)
Comment 9•8 years ago
(In reply to Ben Hearsum [:bhearsum] from comment #8)
> A couple of random questions:
> * What are we calling this thing in terms of application name? The fact that it's
> a different "--enable-application" than other b2g stuff suggests to me that
> it should be called "Graphene" or something similar. This influences a lot
> of things, such as: upload location

Yes, let's call it "Graphene".

> * Are we doing single locale or multilocale builds? My assumption right now
> is single locale, en-US only.

Single locale is good enough for now. Does that prevent us from adding multilocale builds later?

> ** If we're doing multilocale, where is the locales file? (Eg, we use
> http://mxr.mozilla.org/mozilla-central/source/b2g/locales/all-locales for
> b2g desktop builds. We'll almost certainly want one under b2g/graphene
> somewhere if we're doing multilocale.)

We are building /b2g, so don't we get the b2g/locales too?
Flags: needinfo?(fabrice)
Comment 10•8 years ago
Comment on attachment 8566148 [details] [diff] [review]
move + update graphene mozconfigs

Review of attachment 8566148 [details] [diff] [review]:
-----------------------------------------------------------------

thanks!
Attachment #8566148 - Flags: review?(fabrice) → review+
Assignee
Comment 11•8 years ago
(In reply to Fabrice Desré [:fabrice] from comment #9)
> Yes, let's call it "Graphene".

OK, will do.

> Single locale is good enough for now. Does that prevent us from adding
> multilocale builds later?

This won't prevent us from doing additional things later, I just needed to know what to start with.

> We are building /b2g, so don't we get the b2g/locales too?

I suspect this will cause pain down the road. If Graphene is truly a separate app, we should avoid depending on things in b2g/. As a concrete example, if we use the same locales files as b2g desktop/device/other builds use, we won't be able to configure their locales independently. With that said, I'm not sure we need a locales file at all yet, since we're doing single locale builds.
Comment 12•8 years ago
Really, the locale content for graphene should be 100% similar to b2g, since we have basically no UI. But let's go single locale for now.
Assignee
Comment 13•8 years ago
(In reply to Fabrice Desré [:fabrice] from comment #12)
> Really, the locale content for graphene should be 100% similar to b2g, since
> we have basically no UI. But let's go single locale for now.

Past experience tells me that it's very likely this will change. But let's cross that bridge when we come to it.
Assignee
Updated•8 years ago
Attachment #8566148 - Flags: checked-in+
Assignee
Comment 14•8 years ago
Quick status update: I have a 64-bit Linux Graphene build that was built through buildbot+mozharness. There's still lots to do, but it's a milestone at least. One thing that I realized is that updates may not work unless nsUpdateService is actually getting built and there's chrome to support the application of updates. That doesn't stop us from generating MARs and publishing update metadata in the meantime, though.
Assignee
Comment 15•8 years ago
I think this should be pretty straightforward; it's just enabling Graphene for Larch. Fabrice, these are going to end up in /pub/mozilla.org/firefox, at least for now. I'm trying to figure out whether it's worth the trouble of creating /pub/mozilla.org/graphene, given that ftp is going away before EOY.
Attachment #8567940 - Flags: review?(jlund)
Assignee
Comment 16•8 years ago
This has gotten me as far as a Graphene build in staging, though it doesn't complete all the postrelease steps because the taskcluster upload fails. I want to get this enabled on Larch and push forward there rather than fight with staging.
Attachment #8567943 - Flags: review?(jlund)
Assignee
Comment 17•8 years ago
I was hitting errors about missing keys for a bunch of stuff when I first tried to create this builder. It turns out that's because this block was getting executed even if factory_kwargs ended up unused. Moving it inside the deeper block means we don't need to set stage_platform and a bunch of other unused stuff for mozharness-only platforms.
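The shape of the change, heavily simplified. pf stands in for the platform config dict; apart from enable_dep/enable_periodic and stage_platform, the key names and surrounding structure below are assumptions, not the actual misc.py code:

# Simplified sketch of the refactor: build factory_kwargs only on the
# branch that consumes it, so mozharness-only platforms don't need the
# MercurialBuildFactory-specific keys at all.
def make_builders(pf):
    builders = []
    if pf.get('enable_dep', True) or pf.get('enable_periodic', False):
        if 'mozharness_config' in pf:
            # mozharness-only platforms run through a ScriptFactory and
            # never touch factory_kwargs.
            builders.append(('ScriptFactory', pf['mozharness_config']))
        else:
            factory_kwargs = {
                'stage_platform': pf['stage_platform'],
                # ...many more MercurialBuildFactory-only keys...
            }
            builders.append(('MercurialBuildFactory', factory_kwargs))
    return builders

print(make_builders({'mozharness_config': 'builds/releng_base_linux_64_builds.py'}))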
Attachment #8567944 - Flags: review?(jlund)
Comment 18•8 years ago
Comment on attachment 8567944 [details] [diff] [review]
don't require a bunch of extra config for mozharness-only platforms

Review of attachment 8567944 [details] [diff] [review]:
-----------------------------------------------------------------

::: misc.py
@@ -1688,5 @@
> mozharness_repo_cache = pf.get('mozharness_repo_cache')
>
> # Some platforms shouldn't do dep builds (i.e. RPM)
> if pf.get('enable_dep', True) or pf.get('enable_periodic', False):
> -    factory_kwargs = {

I think this is okay. We reuse factory_kwargs in other variants below (factory_kwargs.copy()) in things like pgo and noprofiling, which originally made me think this would cause a bug if an opt build was done via mh and a variant of opt wasn't. Maybe that's not an issue. Either way, since you removed all of the old MBF config items for things like graphene, this sounds needed. Plus we will have to do it when we 'prune' the desktop builds' bbot-cfg items.
Attachment #8567944 - Flags: review?(jlund) → review+
Comment 19•8 years ago
Comment on attachment 8567940 [details] [diff] [review]
enabled graphene builds on larch

Review of attachment 8567940 [details] [diff] [review]:
-----------------------------------------------------------------

iiuc, larch was being used for android stuff. Did you want to remove the android builders or run them side by side? I'll r+ so I don't block, as I think this won't break anything, but I have some sanity-check questions in b2g_project_branches.py.

::: mozilla/b2g_config.py
@@ +59,5 @@
> 'dolphin-512_eng': {},
> +
> + # Graphene builds. These are a different app (ie, not B2G) and would
> + # have their own config files in an ideal world, but it's not worth
> + # the effort at this point.

Sounds good. Do you foresee our list of graphene variants and platforms growing past this single one? If so, I wonder at what point we should rip this out of b2g_config.py.

@@ +1562,5 @@
> 'enable_dep': False,
> },
> +
> +
> + "linux64_graphene": {

wow, it's so nice to see such a small list here! :)

::: mozilla/b2g_project_branches.py
@@ +113,5 @@
> # 'gum': {},
> # disabled for bug 985718
> #'holly': {},
> 'jamun': {},
> + 'larch': {

Uncommenting larch will add all the b2g builders, fyi. I think that will be about ~25 builders. Do we need them?

@@ +114,5 @@
> # disabled for bug 985718
> #'holly': {},
> 'jamun': {},
> + 'larch': {
> +     "desktop_mozharness_builds_enabled": True,

So I think this will have a fun side effect: it will add mulet builds on larch that use mh, because of this thing that was copied from linux64[1]. I think that's okay. Catlee spun a mulet mh build but mentioned the tooltool paths were incorrect or something?

[1] http://mxr.mozilla.org/build/source/buildbot-configs/mozilla/b2g_config.py#436

@@ +116,5 @@
> 'jamun': {},
> + 'larch': {
> +     "desktop_mozharness_builds_enabled": True,
> +     "platforms": {
> +         "linux64_graphene": {},

You're not locking this platforms dict, and the only other reason we would want it is to add branch-platform specific stuff, but your "linux64_graphene" is empty {}.
Attachment #8567940 - Flags: review?(jlund) → review+
Comment 20•8 years ago
Comment on attachment 8567943 [details] [diff] [review]
add graphene subconfig

Review of attachment 8567943 [details] [diff] [review]:
-----------------------------------------------------------------

lgtm so far :)

::: mozharness/mozilla/building/buildbase.py
@@ +281,5 @@
> 'asan-and-debug': 'builds/releng_sub_%s_configs/%s_asan_and_debug.py',
> 'stat-and-debug': 'builds/releng_sub_%s_configs/%s_stat_and_debug.py',
> 'mulet': 'builds/releng_sub_%s_configs/%s_mulet.py',
> 'code-coverage': 'builds/releng_sub_%s_configs/%s_code_coverage.py',
> + 'graphene': 'builds/releng_sub_%s_configs/%s_graphene.py',

how do you like my magic :)
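The 'magic' is a variant-to-config-template lookup. A sketch of how the new entry resolves to a concrete config path (the two table rows are copied from the quoted diff; the resolve() helper is an illustrative stand-in for what the build script does):

# Table rows come from buildbase.py as quoted above; resolve() is assumed.
CONFIG_TEMPLATES = {
    'mulet': 'builds/releng_sub_%s_configs/%s_mulet.py',
    'graphene': 'builds/releng_sub_%s_configs/%s_graphene.py',
}

def resolve(variant, platform, bits):
    # the first %s is the platform family, the second the word size
    return CONFIG_TEMPLATES[variant] % (platform, bits)

# e.g. the linux64 graphene sub-config:
assert resolve('graphene', 'linux', '64') == \
    'builds/releng_sub_linux_configs/64_graphene.py'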
Attachment #8567943 - Flags: review?(jlund) → review+
Assignee
Comment 21•8 years ago
(In reply to Jordan Lund (:jlund) from comment #18)
> I think this is okay. We reuse factory_kwargs in other variants below
> (factory_kwargs.copy()) in things like pgo and noprofiling. ... Maybe
> that's not an issue.

That's good to know. I'm going to run dump_master.py to make sure I'm not making any unwanted changes. I didn't test existing build types in staging.

(In reply to Jordan Lund (:jlund) from comment #19)
> iiuc, larch was being used for android stuff. Did you want to remove the
> android builders or run them side by side?

I...don't know. I'll see if there's anything we should be shutting off here separately.

> Sounds good. Do you foresee our list of graphene variants and platforms
> growing past this single one? If so, I wonder at what point we should rip
> this out of b2g_config.py.

I don't know for sure, but I suspect we'll have at least debug builds at some point. Frankly, I didn't want to go to the trouble of setting up a ton of parallel configs given that we're moving to taskcluster in the foreseeable future.

> Uncommenting larch will add all the b2g builders, fyi. I think that will be
> about ~25 builders. Do we need them?

Not sure. Will look into this at the same time as the Android stuff.

> So I think this will have a fun side effect: it will add mulet builds on
> larch that use mh, because of this thing that was copied from linux64[1]. I
> think that's okay. Catlee spun a mulet mh build but mentioned the tooltool
> paths were incorrect or something?
>
> [1] http://mxr.mozilla.org/build/source/buildbot-configs/mozilla/b2g_config.py#436

I dunno. I'll leave this as-is for now and come back to it. Either catlee's patch will fix it, I'll disable it, or I'll flip the platform-level flag to make it MBF again.

> You're not locking this platforms dict, and the only other reason we would
> want it is to add branch-platform specific stuff, but your
> "linux64_graphene" is empty {}.

This is leftover from when I thought I could add the platform here to enable it just for this branch. Turns out I had to add it to the default list too (yay =\), and then remove it for every other branch. Mind if I leave this in for now, at least until I figure out if I'll be disabling other platforms (at which time I'd need to set lock_platforms)?
Assignee
Comment 22•8 years ago
Builder list difference vs. master:

➜ buildbotcustom git:(graphene-builds) diff -Naur ~/tmp/{before,after}-builders.txt
--- /home/bhearsum/tmp/before-builders.txt 2015-02-24 12:53:38.656383038 -0500
+++ /home/bhearsum/tmp/after-builders.txt 2015-02-24 12:54:10.360610194 -0500
@@ -843,6 +843,32 @@
 Win32 Mulet pine build NightlyBuildFactory
 b2g_pine_win32_gecko build NightlyBuildFactory
 b2g_pine_win32_gecko-debug build NightlyBuildFactory
+b2g_larch_emulator_dep ScriptFactory
+b2g_larch_emulator-debug_dep ScriptFactory
+b2g_larch_emulator-jb_dep ScriptFactory
+b2g_larch_emulator-jb-debug_dep ScriptFactory
+b2g_larch_emulator-kk_periodic ScriptFactory
+b2g_larch_emulator-kk-debug_periodic ScriptFactory
+b2g_larch_flame-kk_periodic ScriptFactory
+b2g_larch_flame-kk_eng_dep ScriptFactory
+b2g_larch_flame-kk_eng-debug_periodic ScriptFactory
+b2g_larch_linux32_gecko build NightlyBuildFactory
+b2g_larch_linux32_gecko-debug build NightlyBuildFactory
+b2g_larch_linux64-b2g-haz_dep ScriptFactory
+Linux x86-64 Mulet larch build ScriptFactory
+b2g_larch_linux64_gecko build NightlyBuildFactory
+b2g_larch_linux64_gecko-debug build NightlyBuildFactory
+graphene_larch_linux64 build ScriptFactory
+OS X Mulet larch build NightlyBuildFactory
+b2g_larch_macosx64_gecko build NightlyBuildFactory
+b2g_larch_macosx64_gecko-debug build NightlyBuildFactory
+b2g_larch_nexus-4_periodic ScriptFactory
+b2g_larch_nexus-4_eng_periodic ScriptFactory
+b2g_larch_nexus-5-l_periodic ScriptFactory
+b2g_larch_nexus-5-l_eng_periodic ScriptFactory
+Win32 Mulet larch build NightlyBuildFactory
+b2g_larch_win32_gecko build NightlyBuildFactory
+b2g_larch_win32_gecko-debug build NightlyBuildFactory
 b2g_fx-team_dolphin_periodic ScriptFactory
 b2g_fx-team_dolphin_eng_periodic ScriptFactory
 b2g_fx-team_emulator_dep ScriptFactory

dump_master.py looked sane as well, except that due to ordering issues it appears that Maple is newly added (because diff turns the maple steps into larch ones, and then adds maple anew afterwards).
Comment 23•8 years ago
> > I think this is okay. We reuse factory_kwargs in other variants below
> > (factory_kwargs.copy()) in things like pgo and noprofiling. ...
>
> That's good to know. I'm going to run dump_master.py to make sure I'm not
> making any unwanted changes. I didn't test existing build types in staging.

Sounds like nothing came up in dump_master (comment 22).

> > iiuc, larch was being used for android stuff. Did you want to remove the
> > android builders or run them side by side?
>
> I...don't know. I'll see if there's anything we should be shutting off here
> separately.

cool

> I don't know for sure, but I suspect we'll have at least debug builds at
> some point. Frankly, I didn't want to go to the trouble of setting up a ton
> of parallel configs given that we're moving to taskcluster in the
> foreseeable future.

makes sense

> > Uncommenting larch will add all the b2g builders, fyi. I think that will
> > be about ~25 builders. Do we need them?
>
> Not sure. Will look into this at the same time as the Android stuff.

k

> I dunno. I'll leave this as-is for now and come back to it. Either catlee's
> patch will fix it, I'll disable it, or I'll flip the platform-level flag to
> make it MBF again.

We can fix mulet at the same time! ;)

> This is leftover from when I thought I could add the platform here to
> enable it just for this branch. ... Mind if I leave this in for now, at
> least until I figure out if I'll be disabling other platforms (at which
> time I'd need to set lock_platforms)?

coola boola. Thanks for bearing all those questions. Ship it!
Assignee
Updated•8 years ago
Attachment #8567940 - Flags: checked-in+
Assignee
Updated•8 years ago
Attachment #8567943 - Flags: checked-in+
Assignee
Updated•8 years ago
Attachment #8567944 - Flags: checked-in+
Comment 24•8 years ago
In production: https://hg.mozilla.org/build/buildbot-configs/rev/80d1f4462ef1
Comment 25•8 years ago
In production: https://hg.mozilla.org/build/buildbotcustom/rev/51a1172a0351
Comment 26•8 years ago
In production: https://hg.mozilla.org/build/mozharness/rev/3a6c750da9a1
Comment 27•8 years ago
postrun.py doesn't like uploading this to ftp. bm71 failed with:

OSError: [Errno 13] Permission denied: '/home/ftp/pub/graphene'

Looks like this[1] uses 'product' for the upload path, and ftp isn't aware of graphene yet?

I moved the dead items to /dev/shm/queue/commands/bhearsum_bup/ on bm71 in case you want to look at the full traceback.

[1] http://mxr.mozilla.org/build/source/tools/stage/post_upload.py#236
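A sketch of the failure mode: post_upload.py keys the destination directory off the buildbot 'product' property, and /home/ftp/pub has no pre-created, writable graphene directory. The helper and exact layout below are assumptions, not post_upload.py's actual code:

import os

FTP_ROOT = '/home/ftp/pub'

def nightly_dir(product, branch, buildid):
    # illustrative: the product becomes the top-level directory
    return os.path.join(FTP_ROOT, product, 'nightly', '%s-%s' % (buildid, branch))

path = nightly_dir('graphene', 'larch', '2015-02-25-01-02-03')
print(path)  # /home/ftp/pub/graphene/nightly/2015-02-25-01-02-03-larch
# os.makedirs(path) on the server would raise:
# OSError: [Errno 13] Permission denied: '/home/ftp/pub/graphene'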
Assignee
Comment 28•8 years ago
(In reply to Jordan Lund (:jlund) from comment #27)
> postrun.py doesn't like uploading this to ftp. bm71 failed with:
>
> OSError: [Errno 13] Permission denied: '/home/ftp/pub/graphene'
>
> Looks like this[1] uses 'product' for the upload path, and ftp isn't aware
> of graphene yet?
>
> [1] http://mxr.mozilla.org/build/source/tools/stage/post_upload.py#236

Whoops, my bad. This should be an easy fix, I'll get a patch together.
Assignee
Comment 29•8 years ago
In an ideal world we'd create a graphene namespace on the ftp server. In reality, we're moving to S3 before EOY, so it's not worthwhile. Better just to upload to b2g for now. These files will have "graphene" instead of "b2g" in their name, so they shouldn't collide with any existing stuff.
Attachment #8569160 - Flags: review?(jlund)
Assignee
Comment 30•8 years ago
Attachment #8569201 - Flags: review?(jlund)
Assignee
Comment 31•8 years ago
No need to parse this l10n block if l10n is disabled!
Attachment #8569202 - Flags: review?(jlund)
Assignee
Comment 32•8 years ago
I think this should be it for round two... there will probably be additional changes after the first round of nightlies, though.
Attachment #8569160 - Attachment is obsolete: true
Attachment #8569160 - Flags: review?(jlund)
Attachment #8569203 - Flags: review?(jlund)
Comment 33•8 years ago
Comment on attachment 8569203 [details] [diff] [review]
add mac+windows; enable nightlies; fix stage product

Review of attachment 8569203 [details] [diff] [review]:
-----------------------------------------------------------------

Sanity check: you reverted back to graphene as stage_product. Does that mean we fixed things on the ftp side?
Assignee
Comment 34•8 years ago
(In reply to Jordan Lund (:jlund) from comment #33)
> Sanity check: you reverted back to graphene as stage_product. Does that
> mean we fixed things on the ftp side?

Nope... that means I screwed up this patch.
Comment 35•8 years ago
Comment on attachment 8569203 [details] [diff] [review]
add mac+windows; enable nightlies; fix stage product

Review of attachment 8569203 [details] [diff] [review]:
-----------------------------------------------------------------

::: mozilla/b2g_config.py
@@ +1614,5 @@
> + ],
> + "mozharness_desktop_build": {
> +     "script_name": "scripts/fx_desktop_build.py",
> +     "extra_args": [
> +         "--config", "builds/releng_base_win_64_builds.py",

win64 builds are still using MBF. I have enabled win64 mh on cedar, but I haven't gone back and worked out the kinks. This might not work right away, but I'll take a look at win64 mh today.
Comment 36•8 years ago
Comment on attachment 8569201 [details] [diff] [review]
mozharness configs for mac and windows

Review of attachment 8569201 [details] [diff] [review]:
-----------------------------------------------------------------

We should overwrite mh config['stage_product'] too. fx_desktop_build.py was made for ff: http://mxr.mozilla.org/build/source/mozharness/scripts/fx_desktop_build.py#65

I made the assumption at the time that if we were to have a new product, we would need a new script. The quickest solution is to overwrite the default value from line 65 above in your configs/build/$PLATFORM/64_graphene.py configs.

You might ask, why do bbot-cfgs require 'stage_product' then, if mh has its own? Unfortunately, postrun.py uses buildbot properties, in this case 'product', to determine the ftp path: http://mxr.mozilla.org/build/source/buildbotcustom/misc.py#1003

postrun.py could have been patched to support two ways of determining the ftp path, but my end goal was to get this working :)
Assignee
Comment 37•8 years ago
(In reply to Jordan Lund (:jlund) from comment #35)
> win64 builds are still using MBF. I have enabled win64 mh on cedar, but I
> haven't gone back and worked out the kinks. This might not work right away,
> but I'll take a look at win64 mh today.

I don't think you need to rush anything on my account. If it doesn't work, I'm happy to help fight through it.

(In reply to Jordan Lund (:jlund) from comment #36)
> We should overwrite mh config['stage_product'] too.

I'll fix this. Thanks for catching it.
Assignee
Comment 38•8 years ago
Attachment #8569201 - Attachment is obsolete: true
Attachment #8569203 - Attachment is obsolete: true
Attachment #8569201 - Flags: review?(jlund)
Attachment #8569203 - Flags: review?(jlund)
Attachment #8569337 - Flags: review?(jlund)
Assignee
Comment 39•8 years ago
Attachment #8569339 - Flags: review?(jlund)
Comment 40•8 years ago
Comment on attachment 8569202 [details] [diff] [review]
rework a bit more misc.py logic to avoid more variables

Review of attachment 8569202 [details] [diff] [review]:
-----------------------------------------------------------------

::: misc.py
@@ -1238,5 @@
>
> if do_nightly:
>     builder = '%s nightly' % base_name
> -   l10n_builder = '%s %s %s l10n nightly' % (
> -       pf['product_name'].capitalize(), name, platform

I think it's just 'product_name' that we are having a problem with here.

@@ +1241,2 @@
>     nightlyBuilders.append(builder)
> +   if config["enable_l10n"]:

Massimo will know more, but I think 'enable_l10n' only represents whether we have l10n enabled for a branch and buildbot is doing it. He has is_l10n_with_mh() for when a branch has l10n enabled and mh is doing it[1]. iow, (not pf['enable_l10n']) != (not is_l10n_with_mh()).

[1] http://mxr.mozilla.org/build/source/buildbotcustom/misc.py#3272
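The distinction being drawn, sketched with stand-in predicates. Only the names enable_l10n and is_l10n_with_mh come from misc.py; the bodies, and the flag consulted by is_l10n_with_mh, are assumptions:

def l10n_with_buildbot(config):
    # 'enable_l10n' says the branch does l10n repacks, driven by buildbot
    return config.get('enable_l10n', False)

def is_l10n_with_mh(config):
    # separate flag for branches where mozharness drives the repacks
    return config.get('desktop_mozharness_l10n_enabled', False)

# The nightly block should only create l10n_builder names when buildbot
# itself does the repacks; the two predicates are not equivalent:
config = {'enable_l10n': True, 'desktop_mozharness_l10n_enabled': True}
if l10n_with_buildbot(config) and not is_l10n_with_mh(config):
    print("create buildbot l10n builders")
else:
    print("skip: mozharness handles l10n (or l10n is off)")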
Updated•8 years ago
Attachment #8569337 - Flags: review?(jlund) → review+
Updated•8 years ago
Attachment #8569339 - Flags: review?(jlund) → review+
Assignee
Comment 41•8 years ago
(In reply to Jordan Lund (:jlund) from comment #40)
> ::: misc.py
> @@ -1238,5 @@
> >
> > if do_nightly:
> >     builder = '%s nightly' % base_name
> > -   l10n_builder = '%s %s %s l10n nightly' % (
> > -       pf['product_name'].capitalize(), name, platform
>
> I think it's just 'product_name' that we are having a problem with here.

Right. But why are we filling out l10n args and other crap when l10n is not even enabled?

> Massimo will know more, but I think 'enable_l10n' only represents whether
> we have l10n enabled for a branch and buildbot is doing it. He has
> is_l10n_with_mh() for when a branch has l10n enabled and mh is doing it[1].
> iow, (not pf['enable_l10n']) != (not is_l10n_with_mh()).
>
> [1] http://mxr.mozilla.org/build/source/buildbotcustom/misc.py#3272
Assignee
Comment 42•8 years ago
(In reply to Ben Hearsum [:bhearsum] from comment #41)
> Right. But why are we filling out l10n args and other crap when l10n is not
> even enabled?

OK, I ran dump_master and I think I understand what you're saying now:

-<buildbot.schedulers.triggerable.Triggerable> {'builderNames': ['Linux ash nightly l10n 1/3',
- 'Linux ash nightly l10n 2/3',
- 'Linux ash nightly l10n 3/3'],
- 'name': 'Linux ash nightly l10n',
- 'properties': {'scheduler': 'Linux ash nightly l10n'}}
-<buildbot.schedulers.triggerable.Triggerable> {'builderNames': ['Linux x86-64 ash nightly l10n 1/3',
- 'Linux x86-64 ash nightly l10n 2/3',
- 'Linux x86-64 ash nightly l10n 3/3'],
- 'name': 'Linux x86-64 ash nightly l10n',
- 'properties': {'scheduler': 'Linux x86-64 ash nightly l10n'}}
-<buildbot.schedulers.triggerable.Triggerable> {'builderNames': ['OS X 10.7 ash nightly l10n 1/3',
- 'OS X 10.7 ash nightly l10n 2/3',
- 'OS X 10.7 ash nightly l10n 3/3'],
- 'name': 'OS X 10.7 ash nightly l10n',
- 'properties': {'scheduler': 'OS X 10.7 ash nightly l10n'}}

Which sucks, but here we are. I'll fix my patch.
Assignee
Comment 43•8 years ago
Attachment #8569202 - Attachment is obsolete: true
Attachment #8569202 - Flags: review?(jlund)
Attachment #8569949 - Flags: review?(jlund)
Comment 44•8 years ago
Comment on attachment 8569949 [details] [diff] [review]
fix buildbotcustom logic

Review of attachment 8569949 [details] [diff] [review]:
-----------------------------------------------------------------

I guess l10n_builder isn't used in the mh l10n case, and we go inside this block: http://mxr.mozilla.org/build/source/buildbotcustom/misc.py#1259

This works too.
Attachment #8569949 - Flags: review?(jlund) → review+
Assignee
Updated•8 years ago
Attachment #8569949 - Flags: checked-in+
Assignee
Updated•8 years ago
Attachment #8569339 - Flags: checked-in+
Assignee
Updated•8 years ago
Attachment #8569337 - Flags: checked-in+
Reporter
Updated•8 years ago
Attachment #8570508 - Flags: review? → review+
Assignee
Updated•8 years ago
Attachment #8570508 - Flags: checked-in+
Assignee
Comment 46•8 years ago
The nightly builds got as far as trying to upload symbols...
Attachment #8570540 - Flags: review?(jlund)
Assignee
Comment 47•8 years ago
In production: https://hg.mozilla.org/build/buildbot-configs/rev/e218dfcb8398
Comment 48•8 years ago
Comment on attachment 8570540 [details] [diff] [review]
add symbol server stuff

ship it!
Attachment #8570540 - Flags: review?(jlund) → review+
Assignee
Updated•8 years ago
Attachment #8570540 - Flags: checked-in+
Assignee
Comment 49•8 years ago
Graphene CI+nightly builders for Linux64, Mac, and Win64 now exist, in varying states of working:
* Mac builds correctly, but something seems broken with MARs. It creates one, but it doesn't get uploaded, and the Balrog submission fails. This suggests an issue in package-name.mk.
* Windows builds fine but fails when trying to run l10n-check. This step needs to be disabled; it's not applicable here. The log is weird: the mozconfig it shows doesn't match the one that I see in the repo.

Working through this currently.
Assignee
Comment 50•8 years ago
Looks like the complete MAR generation actually fails on Linux/Mac:

11:04:45 INFO - /builds/slave/l-osx64_graphene-ntly-00000000/build/src/tools/update-packaging/make_full_update.sh: line 110: /builds/slave/l-osx64_graphene-ntly-00000000/build/src/obj-graphene/dist/host/bin/mar: No such file or directory
11:04:45 INFO - mv: rename /builds/slave/l-osx64_graphene-ntly-00000000/build/src/obj-graphene/dist/graphene/B2G.app.work/output.mar to ../../dist/update/graphene-39.0a1.en-US.mac64.complete.mar: No such file or directory
11:04:45 INFO - Finished

So, the mar binary is missing. The second error will probably be fixed by getting the mar binary built. I don't quite understand why it isn't built, given:

08:50:47 INFO - export MOZ_AUTOMATION_UPDATE_PACKAGING=1

Windows will need additional help here too; it doesn't even have MOZ_AUTOMATION_UPDATE_PACKAGING set:

10:43:33 INFO - export MOZ_AUTOMATION_UPDATE_PACKAGING=0

And Windows is also failing at package-tests:

11:32:53 INFO - find -L dist/test-stage -name '*.pyc' -exec rm {} \;
11:32:53 INFO - find: Filesystem loop detected; `dist/test-stage/jsreftest/tests/ecma_3_1' is part of the same filesystem loop as `dist/test-stage'.
11:32:53 INFO - c:/builds/moz2_slave/l-w64_graphene-ntly-0000000000/build/src/testing/testsuite-targets.mk:418: recipe for target 'package-tests' failed
11:32:53 INFO - mozmake.EXE[1]: *** [package-tests] Error 1
11:32:53 INFO - mozmake.EXE[1]: Leaving directory 'c:/builds/moz2_slave/l-w64_graphene-ntly-0000000000/build/src/obj-firefox'
11:32:53 INFO - c:/builds/moz2_slave/l-w64_graphene-ntly-0000000000/build/src/build/moz-automation.mk:128: recipe for target 'automation/package-tests' failed
Assignee
Comment 51•8 years ago
(In reply to Ben Hearsum [:bhearsum] from comment #50)
> Looks like the complete MAR generation actually fails on Linux/Mac:
> ...
> So, the mar binary is missing. The second error will probably be fixed by
> getting the mar binary built. I don't quite understand why it isn't built,
> given:
> 08:50:47 INFO - export MOZ_AUTOMATION_UPDATE_PACKAGING=1

I think this is because MOZ_UPDATER gets unset in the branding files. Based on some vcs digging, it looks like this was a straight copy from Android XUL, which disabled the updater because it had its own. I doubt we want that here, so I removed it in https://hg.mozilla.org/projects/larch/rev/7a74f46734c2.
Assignee
Comment 52•8 years ago
(In reply to Ben Hearsum [:bhearsum] from comment #51)
> I think this is because MOZ_UPDATER gets unset in the branding files. Based
> on some vcs digging, it looks like this was a straight copy from Android
> XUL, which disabled the updater because it had its own. I doubt we want
> that here, so I removed it in
> https://hg.mozilla.org/projects/larch/rev/7a74f46734c2.

This didn't do the trick. MOZ_UPDATER is still empty.
Assignee
Comment 53•8 years ago
(In reply to Ben Hearsum [:bhearsum] from comment #52)
> > I think this is because MOZ_UPDATER gets unset in the branding files. ...
>
> This didn't do the trick. MOZ_UPDATER is still empty.

Got this to work finally, with mshal's help. Still a couple of issues to work through with Balrog submission, but we're getting there.
Assignee
Comment 54•8 years ago
Things are progressing here. The Mac and Linux builds are more or less working now AFAICT, except for trying to sendchange when they shouldn't. I pushed a fix for that, so the next ones should be green. Once those are done, we should be able to test updates. Windows is busted by what appears to be a general issue with mach+mozharness+win64; I'm trying to debug that with jlund and mshal's help.
Assignee
Comment 55•8 years ago
OK, things are going pretty well here now. Linux and Mac both have fully working nightly builds in terms of build process. Windows is coming along too. Some things that still need to be addressed:
* Fix Treeherder to show jobs better. The Mac and Windows builds have an ugly row name. Linux reports in as "Linux x64 opt", which conflicts with Firefox jobs.
* Fix Windows packaging. The latest build failed when trying to run "make installer". I've disabled that for now, but I'm pretty sure the complete MAR will fail to build without an installer. I suspect we'll need to create some sort of installer for Graphene to make this work. If that's the case, I'm going to need someone else to have a look at that part...it's way outside my comfort zone.
* Test that updates work. The update server appears to be serving valid information for Linux and Mac now, but I don't know how to test on the client side (a server-side sanity check is sketched after this list).
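One way to sanity-check the server side: fetch the XML a client would request and eyeball it. Everything after /update/3/ in the URL below (buildid, build target, channel name, and so on) is an assumption; substitute the values from a real nightly log:

# Hedged sketch, Python 2 to match the era's tooling.
import urllib2

url = ("https://aus4.mozilla.org/update/3/Graphene/39.0a1/20150225030202/"
       "Linux_x86_64-gcc3/en-US/nightly-larch/default/default/default/update.xml")
print(urllib2.urlopen(url).read())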
Assignee
Comment 56•8 years ago
(In reply to Ben Hearsum [:bhearsum] from comment #55)
> * Fix Treeherder to show jobs better. The Mac and Windows builds have an
> ugly row name. Linux reports in as "Linux x64 opt", which conflicts with
> Firefox jobs.

Turns out that this is just waiting on my previous Treeherder patches to be pushed to prod. It all looks good on https://treeherder.allizom.org/#/jobs?repo=larch&exclusion_profile=false.
Assignee
Comment 57•8 years ago
I'm going to call this fixed now. The builds we want are not 100% working, but this bug is big enough that it's time to deal with remaining issues in follow-ups.

(In reply to Ben Hearsum [:bhearsum] from comment #55)
> * Fix Windows packaging. The latest build failed when trying to run "make
> installer". I've disabled that for now, but I'm pretty sure the complete
> MAR will fail to build without an installer. I suspect we'll need to create
> some sort of installer for Graphene to make this work.

Looks like I'm right about this - MARs fail to build when the installer is disabled. I filed bug 1139944 to track fixing this.

> * Test that updates work. The update server appears to be serving valid
> information for Linux and Mac now, but I don't know how to test on the
> client side.

Fabrice told me on IRC that he wants to test his patches locally before pushing them to Larch. If we have any issues we'll file follow-ups.
Status: NEW → RESOLVED
Closed: 8 years ago
Resolution: --- → FIXED
Updated•5 years ago
Component: General Automation → General