change how automation does l10n repacks

Status: RESOLVED FIXED
Product: Release Engineering
Component: General
Priority: P2
Severity: normal
Reported: 10 years ago
Last updated: 5 years ago

People

(Reporter: joduinn, Assigned: armenzg)

Tracking

Firefox Tracking Flags

(Not tracked)

Details

Attachments

(4 attachments, 10 obsolete attachments)

- 33.38 KB, application/x-bzip2
- 11.51 KB, text/plain
- 14.22 KB, text/plain
- 82.60 KB, image/jpeg
This bug is to track some work items that came up in conf call with Axel, Armen and myself just now.

The file shipped-locales is currently used for archival purpose, and also by the build processes to decide what locales to repack/push/announce. The shipped-locales file tells us what locales we shipped in a given release. Unknown how we currently track if a given locale is "release" or "beta". Regardless of rest of this bug, we'll continue to manually update shipped-locales for every release, to keep an archive of what locales were in what release. This is unchanged from how we already do things now.

1) Change build processes to start using all-locales (not shipped-locales!!) to generate locale repacks for each nightly/clobber build. Possibly changing all-locales to add specific platform exclusions, for example, 
 gu-IN linux win32
 ja linux win32
 ja-JP-mac osx
...but that might require tinderbox changes. Needs investigation. 

2) Do each locale repack as independent repack. Currently for release automation, we do all repacks for each locale, then do all push for each locale, then announce all repacks. Note that all repacks are sequence independent of each other. Proposal is to change process so each repack is self-contained repack-push-announce step. This means: 
- each repack can be done on different slaves, in parallel if slaves are available. 
- we can repack a newly added locale without having to repack every locale in the release. 
- failure of one repack will not block other later repacks from happening.
- this brings nightly repacks and release automation repacks closer in sync with each other.
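The proposed self-contained flow can be sketched in plain Python (an editorial sketch; `repack`, `push`, and `announce` are hypothetical stand-ins for the real automation steps, not names from the actual tooling):

```python
# Sketch of the proposed per-locale pipeline. Each locale runs its own
# repack-push-announce sequence, so one failure cannot block the others,
# and locales can be farmed out to separate slaves.

def process_locale(locale, repack, push, announce):
    """Run the self-contained sequence for one locale.

    Returns True on success, False on failure; a failure stays
    isolated to this locale.
    """
    try:
        build = repack(locale)
        push(build)
        announce(build)
        return True
    except Exception:
        return False

def process_all(locales, repack, push, announce):
    # Every locale is attempted regardless of earlier failures.
    results = {}
    for locale in locales:
        results[locale] = process_locale(locale, repack, push, announce)
    return results
```

In a buildbot setup the loop body would become independent build requests rather than a sequential loop, but the isolation property is the same.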
Assignee: nobody → armenzg
Priority: -- → P2
Just to be clear: this bug is about l10n on 1.9 (and maybe 1.8?), not Mozilla2/3.next, right?

Comment 2

10 years ago
Yes, building 3.next is bug 434289. There are a few loose ends for that, so we'd rather not have Armen get into that as first thing in the morning.

Depending on how things go, we'd not start actually doing builds for mozilla-central until we made some significant progress here. As for making this one block that one, I'd say it's a soft block only.
Blocks: 434289
(In reply to comment #0)
> 1) Change build processes to start using all-locales (not shipped-locales!!) 

Tinderbox actually uses all-locales for a l10n run, so we get that behaviour for nightlies and when we wrap it with Bootstrap & Buildbot for releases.

http://bonsai.mozilla.org/cvsblame.cgi?file=/mozilla/tools/tinderbox/post-mozilla-rel.pl&rev=1.146&mark=838-844,853#814
 
> 2) Do each locale repack as independent repack. Currently for release
> automation, we do all repacks for each locale, then do all push for each
> locale, then announce all repacks.

I'd break this down a little more. Tinderbox loops over the values in all-locales: attempting to build each locale, and copying the results to a staging dir; then it uploads everything to a dated dir (YYYY-MM-DD-$app$version). The release automation copies from the dated dir to the candidates directory, and then makes the email announcement.

> Note that all repacks are sequence
> independent of each other. Proposal is to change process so each repack is
> self-contained repack-push-announce step. This means: 
> - each repack can be done on different slaves, in parallel if slaves are
> available. 

This could be a nice speedup on Windows, our slowest platform (again!). I/O is a problem there, and we still generate a zip file even though we no longer ship them, so it's a double whammy. We can drop making zips when rewriting this; the exe installer and complete update are sufficient (for all platforms).

> - failure of one repack will not block other later repacks from happening.

This isn't a problem with tinderbox now - it's like a modern set of Christmas lights rather than old school. ;-) I guess tinderbox could break partway through building the locales and not upload what it had done, but I can't think of any occasions of that happening.

> - this brings nightly repacks and release automation repacks closer in sync
> with each other.

Not sure I grok this one. The release automation wraps tinderbox, so the l10n build is unchanged; it just copies the files from the dated dir to the candidates dir afterwards. Do you mean that releases and nightlies could both be driven by Buildbot?

Don't get me wrong here, I think getting out from the tinderbox code is worthwhile. And if we bring over some of Axel's improvements (like only building locales with changes during the day) then there are big wins here.

Comment 4

10 years ago
(In reply to comment #3)
The all-fail condition actually kicks in on mar generation or updates. Not sure which. I don't know when we're starting to use shipped-locales instead of all-locales inside release automation.

Does l10n tinderbox actually upload to a dated dir? I know they're not dated on ftp, which is why the download links on tinderbox are all broken.

Untangling the windows zips from the installers is a different bug than this one, as that is really per-app logic in $(app)s/locales/Makefile.in. This one should hardly have to touch that file, hopefully.

Comment 5

10 years ago
Some things that haven't been in the initial comment, but should be noted:

If we want to be able to distribute l10n repacks on multiple slaves, we have to move the information on which locales to build out of the slave (tinderbox or buildslave) into the buildmaster, more precisely, the scheduler. This gives quite a bit of power to re-use builders and other things for the various tasks, which is what I'm doing on the l10n server right now, and what is more detailed on http://wiki.mozilla.org/User:AxelHecht/Building_l10n, including failed attempts to do so.
(In reply to comment #3)
> (In reply to comment #0)
> > 1) Change build processes to start using all-locales (not shipped-locales!!) 
> 
> Tinderbox actually uses all-locales for a l10n run, so we get that behaviour
> for nightlies and when we wrap it with Bootstrap & Buildbot for releases.
>

This made me wonder: how feasible would it be to simply make Bootstrap::Repack support 'nightly' mode (the same way we made TinderConfig and Build).
(In reply to comment #1)
> Just to be clear: this is bug is about l10n on 1.9 (and maybe 1.8?), not
> Mozilla2/3.next, right?

Yes, this is for 1.9, ideally 1.8 also, but let's see how it goes.

There are too many changes going on with the l10n situation on Mozilla2/3.next
right now. After l10n on Mozilla2/3.next stabilizes a bit, we can look at carrying
this same locale-repack work over to Mozilla2/3.next, but I didn't want this
work to get caught up in the churn.


(In reply to comment #2)
> Yes, building 3.next is bug 434289. There are a few loose ends for that, so
> we'd rather not have Armen get into that as first thing in the morning.
> 
> Depending on how things go, we'd not start actually doing builds for
> mozilla-central until we made some significant progress here. Trying to make
> this one block that one, I'd say it's a soft block only.
Actually, my understanding was that bug#434289 blocks doing bug#434878 on
mercurial (i.e., we can't optimize l10n builds on mercurial until after we have
working l10n builds on mercurial!). Obviously, bug#434289 does not block us
doing anything on cvs, so we can start work there now.
No longer blocks: 434289
Depends on: 434289
(In reply to comment #6)
> (In reply to comment #3)
> > (In reply to comment #0)
> 
> This made me wonder: how feasible would it be to simply make Bootstrap::Repack
> support 'nightly' mode (the same way we made TinderConfig and Build).

...or even wonder: how feasible would it be to implement this without using Bootstrap or Tinderbox at all? Just having Buildbot calling Makefile targets seems appealing...

Comment 9

10 years ago
... makes me ask... How do we create full and incremental mars?
(In reply to comment #4)
> (In reply to comment #3)
> The all-fail condition actually kicks in on mar generation or updates. Not sure
> which. 

Can you point to the code for this ?

> I don't know when we're starting to use shipped-locales instead of
> all-locales inside release automation.

We use shipped-locales to work out which files to keep and throw away when staging, and in setting up update verification. We also only tag the shipped locales, so that might have to change if you want to build everything in all-locales for a release. Not sure I see the value in that though.

> Does l10n tinderbox actually upload to a dated dir? I know they're not dated on
> ftp, which is why the download links on tinderbox are all broken.

In the tinder-config.pl for nightlies we set
 $ReleaseToLatest = 1; # Push the release to latest-<milestone>?
 $ReleaseToDated = 0; # Push the release to YYYY-MM-DD-HH-<milestone>?
to get files into latest-mozilla1.9.0 but not a dated dir. It's the opposite for releases, giving the likes of
http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly/2008/05/2008-05-12-03-firefox3.0rc1-l10n/

Comment 11

10 years ago
What happened when Punjabi on Windows failed for fx3b? I don't recall which beta it was, but the lack of a Windows Punjabi build turned out to be fatal.

Anyone remember? I can't reproduce my memories from looking at shipped-locales.
(In reply to comment #8)
> (In reply to comment #6)
> > (In reply to comment #3)
> > > (In reply to comment #0)
> > 
> > This made me wonder: how feasible would it be to simply make Bootstrap::Repack
> > support 'nightly' mode (the same way we made TinderConfig and Build).
> 
> ...or even wonder: how feasible would it be to implement this without using
> Bootstrap or Tinderbox at all? Just having Buildbot calling Makefile targets
> seems appealing...
> 

Yeah, that's the ideal case. But the reason we went with a 'nightly' mode for builds on 1.8 is because it was very low cost. I think we may have a similar situation here.
(In reply to comment #9)
> ... makes me ask... How do we create full and incremental mars?

Full mars for both releases and nightlies get created by the tinderbox via a call to create_update_package(); the main bit of work is at
http://mxr.mozilla.org/seamonkey/source/tools/tinderbox/post-mozilla-rel.pl#664

Partial mars are created separately. We haven't done nightly updates for l10n at all; for en-US there's a perl script that generates them and the update snippets for AUS. For releases, we use a second perl script (mozilla/tools/patcher/patcher2.pl).

Comment 14

10 years ago
(In reply to comment #12)
> (In reply to comment #8)
<...>
> > ...or even wonder: how feasible would it be to implement this without using
> > Bootstrap or Tinderbox at all? Just having Buildbot calling Makefile targets
> > seems appealing...
> > 
> 
> Yeah, that's the ideal case. But the reason we went with a 'nightly' mode for
> builds on 1.8 is because it was very low cost. I think we may have a similar
> situation here.

There's going to be a point when doing it "cheap" isn't going to work anymore. We don't want to see build being a blocker in taking new locales, so we'll need to change stuff. Like, there's another 20% increment in locales around the corner.

Note, this is still going to be "cheap", as I did quite a bit of it already :-)

I wonder if we should have a brownbag about this, like a call with the wider build group, if there's interest. John is likely not too fond of the idea of spending yet another hour on a meeting on this, but maybe it's a good idea.

John?
I was going to write a long reply trying to address most comments, but I will rather just summarize:

a) The build side of repackaging is not difficult (checkouts, make configure and make installers-%). It is a matter of following build-seamonkey-util.pl and post-mozilla-rel.pl. There are configuration values in the tinder-config.pl and the mozconfig files. My script in my blog post captures some of it

b) Fixing "sub packit_l10n" doesn't seem to me a long-run fix

c) Things could be done without Bootstrap, from looking at the code and the logs, BUT there is a lot of overhead related to reporting the right status to tinderbox that I will have to deal with. I think aside from repackaging I could have a Buildbot step call the right "sub mail_" functions that currently live in post-mozilla-rel.pl

d) My _BIGGEST_ concern is how to give a slave a notification that it has to do a repackage, passing along a parameter with the $locale name. Therefore, I will spend this week studying Buildbot and Python, to reach a level at which, at least, I can discuss our different options without most of it going over my head.

e) When I reach that level, I can actually try to extract juice or ideas out of Axel's code and attempts. I understand the code slightly, but it currently goes over my head.

I have spent most of my days trying to understand the l10n repackaging process (what happens between build-seamonkey and post-mozilla-rel), thinking of different places where to fix it, and becoming knowledgeable on the "build" side of the problem. The build side could be optimized, but that will follow later.
I have been reading the discussion and I will be open to reading more concerns and suggestions, but for these next days I will spend most of my time learning and probably poking different people to get the knowledge that I lack.
Status: NEW → ASSIGNED
Working on this should also tackle picking up the right en-US source for the build.
After an en-US build, l10n repackages should happen and compare-locales should be using the same source as the one that en-US used. 
We don't want compare-locales to be using the en-US latest source code but the source code used for the en-US build.
Blocks: 398954
Created attachment 323623 [details]
Automates seeing the proof-of-concept being run

With this buildbot setup, I try to reach one of my goals, which was to understand who-calls-who in buildbot.
This is a custom-made Periodic Scheduler with a Build class, all inspired by Axel's code. I have separated my classes into "l10nArmen.py" to make it easy to see what I have added.

  1) There are 10 locales hard-coded into the scheduler
  2) When the time comes, there are 10 build requests (or build descriptions) that cannot be merged
  3) From these "build descriptions", we set the property "locale" on a Build
  4) The step uses WithProperties to pull the property "locale" from the Build

NOTE: I might have used the names of objects inappropriately but should sound more or less right

More to come...
Created attachment 323624 [details]
This master.cfg runs 10 non-mergeable build requests that have "locale" as one of their properties

Updated

10 years ago
Attachment #323624 - Attachment mime type: application/octet-stream → text/plain
Created attachment 323626 [details]
l10nArmen.py - Contains the PeriodicL10n and Build classes

- The factory gets the Build custom-made class as its .buildClass
- The build's buildQueue is the actual PeriodicL10n scheduler, which has a pop() method
- The doPeriodicBuild creates the objects inserted in the queue with their .locale attribute set
There might be some leftovers in my code, like the list of locales in the master.cfg which is also in the PeriodicL10n scheduler.

These attachments will help to show some problems that I am going to face.
For instance, one of the problems is the hard-coded locales:
1) I am planning on using "buildbot sendchange" in a hacky way to pass a list of locales (could be one, could be many) and see where I can get.

2) If it works I would make a slave do some "thinking" and decide which locales to put in that list and use "sendchange"

There are other options that I have considered which I want to draw and attach rather than typing it

Comment 21

10 years ago
I don't think that slaves should carry any logic on what to build and when. In particular, there's no guarantee that at any particular point in time, there's even a slave connected.

I think that the attachments 2 and 3 nicely show how the scheduler needs to know about what to do.

I see two ways to go from here, configure locales from configuration data, or build-on-push. There's a third component in the stuff that I have, which is to actually re-use the builder across apps and branches, but that doesn't look like a good next step here.
(In reply to comment #21)
> I don't think that slaves should carry any logic on what to build and when. In
> particular, there's no guarantee that at any particular point in time, there's
> even a slave connected.
I do agree that is not appropriate
 
> I think that the attachments 2 and 3 nicely show how the scheduler needs to
> know about what to do.
Maybe I could read the locale list from a file instead of hard-coding it there. The file could be read upon "reconfig".
What do you think of a slave asking the master to reconfigure itself after realizing that the file has changed since last time (a new locale has been added to all-locales or whatever)?
 
> I see two ways to go from here, configure locales from configuration data, or
> build-on-push. 
Could you expand? I don't know what build-on-push means

NOTE: Writing a blog post that people might want to read
Created attachment 323938 [details]
Master.cfg - It checks out the source code for each locale according to all-locales (this is thanks to the PeriodicL10n scheduler)
Attachment #323624 - Attachment is obsolete: true
Created attachment 323940 [details]
l10nArmen.py - It checks out all-locales and creates non-mergeable builds

The following code does all the work. Please comment:

LITTLE NOTE: l10nArmen.py is inside the folder builbotarmen instead of buildbotcustom

  def getLocales(self):
    """Check out browser/locales/all-locales and return the locale list."""
    output, _ = subprocess.Popen(
        ['cvs', '-d:pserver:anonymous@cvs-mirror.mozilla.org:/cvsroot',
         'co', '-p', 'mozilla/browser/locales/all-locales'],
        stdout=subprocess.PIPE).communicate()
    locales = output.split('\n')
    if locales and locales[-1] == '':
        locales = locales[:-1]  # drop the empty entry after the final newline
    return locales

  def doPeriodicBuild(self):
    locales = self.locales or self.getLocales()
    # One non-mergeable build request per locale, each carrying its locale
    for locale in locales:
      self.queue.insert(0, PeriodicL10n.BuildDesc(locale))
      bs = buildset.BuildSet(self.builderNames,
                             PeriodicL10n.NoMergeStamp(branch=self.branch),
                             self.reason)
      self.submit(bs)
Attachment #323626 - Attachment is obsolete: true
Created attachment 324159 [details]
installdmg.ex - This is an expect script that gets called by line 160 of packager.mk - it has to be located in the home directory of the buildslave

This script just helps to mount an image on a Mac (which is part of the repackaging process).
Created attachment 324160 [details]
packager.mk - Lines 159 and 160 modified from the original packager.mk file

This file gets downloaded by a slave (for now) and overwrites the file checked out from cvs. This change just allows the process to go on without waiting for a human to reply "YES" when asked to mount the .dmg file.
Created attachment 324161 [details]
master.cfg - It checks out en-GB and repackages it (on mac)

The difference between this master.cfg and the previous one is that it focuses on repackaging just one language (en-GB). The repackaging requires some more work (there are still some steps to be added), but at the beginning of next week I should have all the steps, and I might try my own "fake" ftp server to push to.

Waiting for some feedback on comment 24 while I keep on moving ahead:

tuple =  subprocess.Popen(
        ['cvs', '-d:pserver:anonymous@cvs-mirror.mozilla.org:/cvsroot',
         'co', '-p', 'mozilla/browser/locales/all-locales'],
        stdout=subprocess.PIPE).communicate()
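A possibly more robust variant of the snippet above (an editorial sketch, not part of the original attachment): it checks the cvs exit status, and it keeps the parsing in a separate helper so the code avoids shadowing the built-ins `tuple` and `list`.

```python
import subprocess

def parse_locales(text):
    """Turn the all-locales file contents into a clean list of locale codes."""
    return [line.strip() for line in text.splitlines() if line.strip()]

def get_locales(cvsroot=':pserver:anonymous@cvs-mirror.mozilla.org:/cvsroot'):
    """Check out all-locales to stdout and return the locale list."""
    proc = subprocess.Popen(
        ['cvs', '-d' + cvsroot, 'co', '-p',
         'mozilla/browser/locales/all-locales'],
        stdout=subprocess.PIPE)
    stdout, _ = proc.communicate()
    if proc.returncode != 0:
        raise RuntimeError('cvs checkout of all-locales failed')
    return parse_locales(stdout.decode())
```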
Created attachment 324311 [details]
packager.mk - Fixes line 162's perl matching from MOUNTPOINT=$'/tmp/Minefield\r' to MOUNTPOINT=/tmp/Minefield
Attachment #324160 - Attachment is obsolete: true
Created attachment 324319 [details]
master.cfg - the most complete configuration which tries to split the work between three slaves

I want to use this comment to bring up certain information that I have hard coded in the master.cfg

|#NOTE: This name changes overtime - where to get what the latest naming is?
|dmg_name = "firefox-3.1a1pre.en-US.mac.dmg"
|#zipin_mac = "firefox.dmg"
|#QUESTION: Where is the line in the Makefile that decides that this is the file |needed?
|#Why didn't it take 3.1a1pre?? -->> More research
|zipin_mac = "firefox-3.0pre.en-US.mac.dmg"
|latest_mac_en_US ="http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly/" \
|                  + "latest-trunk/" + dmg_name
|tag="l10n"

When I do the "make installers-%locale%", I believe I can specify the ZIP_IN value to use whichever naming I want, BUT I need to know the name of the latest build. Where can I grab this info from?

-------------

Another issue I have to work on is the issue of having common code (which would be run unnecessarily) and then the repackaging code

-------------

I want to start wrapping things up into packages or modules under buildbotcustom to make things more presentable and to unify logically related steps.
Attachment #323938 - Attachment is obsolete: true
Attachment #324161 - Attachment is obsolete: true
Depends on: 438436
Question:
- How many repackages of the SAME locale should be stored in "http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly/" and "http://ftp.mozilla.org/pub/mozilla.org/firefox/tinderbox-builds/"? Let's narrow it to 1.9.0.

Let's think of "af", we would have 2 copies:
1) tinderbox-builds/latest-mozilla1.9.0-l10n/
2) nightly/latest-mozilla1.9.0-l10n/

I assume this is the desired behavior, and not what en-US does under "tinderbox-builds", which lists a folder per machine and stores several en-US binaries following a logic I have not yet made sense of.

NOTE: There are also scattered l10n folders in "nightly" for rc2, rc3 and 2.0.0.15 instead of being shown under the folder "release"

------------------
NOTE A BUG: I don't know how "hourly" it is that today (Jun 13, 11:38 AM) the newest 1.8 file under tinderbox-builds is firefox-2.0.0.15pre.af.linux-i686.tar.gz from 12-Jun-2008 12:10, which is almost 24 hours old.
CONCERN:
We do dependent builds too fast (e.g. 6-8 mins on linux) to do l10n repackages for every dependent en-US build.

Solutions:
a) After an en-US build, a trigger step would start a Scheduler that does the l10n repackages. While all locales are being repackaged, a few more en-US builds would finish and run that trigger step; my desired behavior would be for the scheduler to run only once, as if the trigger requests were merged, but I doubt that this is buildbot's default behavior.
b) Every hour we would trigger repackages independently of when the en-US builds finish; I would then start the process without knowing the checkout time of that build, since the "cvscodate" property wouldn't be set.
c) Do not allow an en-US build to continue until all l10n repackages are done. I totally doubt that this is what we want, but it would have the sole benefit of giving us l10n repackages for *every* single dependent build we generate.
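Option a)'s desired "many triggers merge into one run" behavior can be illustrated with a plain-Python debounce (an editorial sketch, not buildbot code; as the comment notes, buildbot's schedulers do not merge trigger requests this way by default):

```python
class MergingTrigger:
    """Collapse any number of trigger requests that arrive while a
    repack run is in progress into a single follow-up run."""

    def __init__(self, run_repacks):
        self.run_repacks = run_repacks  # callable doing one full l10n repack pass
        self.running = False
        self.pending = False

    def trigger(self):
        if self.running:
            # Remember that at least one request arrived; any number
            # of requests during a run merge into one pending flag.
            self.pending = True
            return
        self.running = True
        try:
            self.run_repacks()
        finally:
            self.running = False
        if self.pending:
            # Exactly one follow-up run, no matter how many triggers came in.
            self.pending = False
            self.trigger()
```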

Comment 32

10 years ago
If you know that you're repackaging the last successful build of builder FOO, you can probably just directly ask the status, and you don't have to trigger the scheduler at all.

What I'm concerned about would be flooding the slaves with l10n repacks so that we don't get depend builds going anymore.

Have we actually made a call on whether we want to repack the last depend build or the latest nightly?

I'm not sure, Ben, how far are you with your idle scheduler? That might be a good first step on the way to a build-on-push setup.
(In reply to comment #32)
> If you know that you're repackaging the last successful build of builder FOO,
> you can probably just directly ask the status, and you don't have to trigger
> the scheduler at all.
> 
How can I "ask the status"? 

Comment 34

10 years ago
If I could find a way to get to the master from a step, you could do

for b in master.getStatus().getBuilder('en-US dep builder').generateFinishedBuilds():
  if b successful, getProperty(), break.

Just that I don't see a way to get to the master. I get up as far as the BuilderStatus, but somehow that doesn't seem to have a parent. The process Builder object has the botmaster, but I can't see a way to get from status over to process.

Maybe Ben has better luck?
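An editorial sketch of the loop comment 34 describes, written against a duck-typed status object (the real buildbot BuilderStatus API may differ; `generateFinishedBuilds` and `getProperty` are the names taken from the comment, and `getResults` is an assumption):

```python
SUCCESS = 0  # buildbot's conventional result code for a green build

def last_good_build_property(builder_status, prop):
    """Return `prop` from the most recent successful finished build, or None.

    `builder_status` is expected to yield builds newest-first from
    generateFinishedBuilds(), as described in comment 34.
    """
    for build in builder_status.generateFinishedBuilds():
        if build.getResults() == SUCCESS:
            return build.getProperty(prop)
    return None
```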
Created attachment 325484 [details]
buildbotcustom/l10n/scheduler.py - L10n custom made schedulers

In this file I have now classes as:
   1) PeriodicL10n(Periodic)
   2) TriggerableL10n(Triggerable)
   3) SchedulerL10n(Scheduler) #not yet
   4) DependentL10n(Dependent)

We also have the L10nHelper class that is used by all l10n schedulers.

The custom Build class is in another file (buildbotcustom/l10n/l10n.py) until I decide which is the correct naming for it.

------------

As I write the master.cfg and find out which schedulers I want to use, I will add small patches to review instead of the whole file.
Attachment #323940 - Attachment is obsolete: true
Created attachment 325485 [details]
master.cfg - 3 schedulers: Nightly, Dependent(getting ready) and DependentL10n

This master.cfg is not finished but it should work as this:

- The nightly scheduler starts
  * It should set the MOZ_CO_DATE
  * Clean en-US build happens using the MOZ_CO_DATE
- Dependent scheduler
  * Every single slave that will do an l10n repackage has its own builder ONLY for this scheduler
  1) make -f client.mk l10n-checkout using previously set MOZ_CO_DATE
  2) make -f client.mk configure
  3) make -C config
- Dependent l10n scheduler
  * Every locale will be handed in to an available slave
  * 3 builders, one per platform, but one or more slave per builder
  1) check out specified locale's code
  2) make installers for that locale
  3) Push & Announce

NOTE: I am skipping details or forgetting steps but nothing will be missing when the code is written and tested :P
Attachment #324319 - Attachment is obsolete: true
Depends on: 439778

Comment 37

10 years ago
Hrm, I think I see what you're doing, but I have a hard time paraphrasing it.

I think I see a dependency on having a once-per-day build run on all slaves to set up the workdir for the following l10n builds.

Kinda makes me wonder what happens when slaves go down, need kicking, maintenance and such.

I wonder if this would be way simpler if we hard-coded just repackaging the 4am nightly builds, which bug 439778 is about. Then we could just get the binaries of the keyed ftp dir, use a known timestamp, and set up anything required in the workdir on demand.

Another thought, I wonder if we could make the en-US builder tell the l10n builder that it did a new nightly, and where to find that.
(In reply to comment #37)
> Kinda makes me wonder what happens when slaves go down, need kicking,
> maintainance and such.
> 
Now I understand your wondering.
I will write down the problem that I have found:
* Periodic scheduler for en-US builds
* Dependent scheduler to prepare for l10n repackages (one builder per slave)
* DependentL10n scheduler - one builder - many slaves

If in the Dependent scheduler one of the slaves does not finish the assigned work (it may be offline), then the DependentL10n scheduler will never start, and I doubt that we would get any notification since nothing really fails.

I am going to rethink the situation - the biggest problem is having multiple slaves with buildbot to do l10n repackages

> I wonder if this would be way simpler if we hard-coded just repackaging the 4am
> nightly builds, which bug 439778 is about. Then we could just get the binaries
> of the keyed ftp dir, use a known timestamp, and set up anything required in
> the workdir on demand.
> 
> Another thought, I wonder if we could make the en-US builder tell the l10n
> builder that it did a new nightly, and where to find that.
> 
If we find a way that would be awesome, but for now I am going to try setting the same MOZ_CO_DATE for en-US builds and the l10n repackages

Created attachment 326054 [details]
master.cfg - OneSlave

- This master.cfg runs after an en-US build. The problem is that I can't pass cvscodate from one scheduler to the next. Maybe a plain text file could contain it, be uploaded to the master, and be downloaded by the slave in the preparatory set of steps. bhearsum mentioned that we don't have Scheduler properties in our buildbot 0.7.7.

- About timing:
  * The preparatory builder takes ~2 mins from scratch and ~1 min when the code was already there
  * Each repackage takes ~30 secs, maybe less if the locale's code has already been checked out before

PROBLEMS:
1) If we add more slaves to do l10n repackages, they have to run the preparatory steps; for now this means adding a builder per slave involved in repackages. If one of the slaves disconnects, the DependentL10n scheduler will never run until that slave is back again and has finished its assigned steps.
*Is there are a way to make all slaves of a builder to run a set of steps?*
2) MOZ_CO_DATE or cvscodate. We can't pass it from builder to builder. Maybe a text file could pass this info around
3) Nick mentioned that for releases a TAG is set instead of a date. I will give it a try and see how it goes. It might be useful to have a buildbot class that would load all this info from a bootstrap.cfg file.
4) I have hardcoded in the master.cfg this information: 
  dmg_name = "firefox-3.1a1pre.en-US.mac.dmg"
  zipin_mac = "firefox-3.0.1pre.en-US.mac.dmg"
  latest_mac_en_US ="http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly/" \
                  + "latest-trunk/" + dmg_name
maybe that class for reading bootstrap.cfg might be useful after all
5) Each locale's steps report only a little, yet together they easily fill up the whole page. I will try to create a class wrapping these steps together.
6) It really bugs me that for mac I have to set this workdir="mozilla/objdir/ppc" instead of just "mozilla/objdir"
7) The "l10n_preparation" builder runs all of its steps in the "l10n_repackages"'s workdir
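Problem 2 above (passing MOZ_CO_DATE/cvscodate between builders via a text file, since buildbot 0.7.7 has no Scheduler properties) could be sketched like this (editorial sketch; the file name and transport are hypothetical — in practice the en-US builder would upload the file to the master and the l10n builders would download it):

```python
def write_co_date(path, co_date):
    # Written by the en-US builder right after it pins MOZ_CO_DATE.
    with open(path, 'w') as f:
        f.write(co_date + '\n')

def read_co_date(path):
    # Read back by each l10n builder before its l10n-checkout step,
    # so every repackage uses the same checkout time as the en-US build.
    with open(path) as f:
        return f.read().strip()
```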

SCENARIOS:
1) Nightly. If we have a Nightly scheduler for the en-US builds, we can use the same cvs checkout time by sort of hard-coding the time. I will prove this in my next attempt.

2) Release. We have to use a tag to checkout. I will test this

3) Dependent. The en-US dependent builds are way too fast, trying to catch up with l10n repackages would be difficult. We might want to run repackages every hour or so for dependent l10n repackages

NOTE: Remember that you need the packager.mk and the mozconfig-mac in your master, and the installdmg.ex in your home directory.
Attachment #325485 - Attachment is obsolete: true
Created attachment 326055 [details]
waterfall screenshot showing the DependentL10n scheduler in action
Comment on attachment 324311 [details]
packager.mk - Fixes line 162's perl matching from MOUNTPOINT=$'/tmp/Minefield\r' to MOUNTPOINT=/tmp/Minefield 

I have filed Bug 438240 for it and will ask for review when finished testing.
Attachment #324311 - Attachment is obsolete: true
Comment on attachment 324159 [details]
installdmg.ex - This is an expect script that gets called by line 160 of packager.mk - it has to be located in the home directory of the buildslave

Same as previous comment
Attachment #324159 - Attachment is obsolete: true

Comment 43

10 years ago
As a note, please keep the _option_ to build/repackage/ship both the installer and the zip on Windows, even if Firefox does not ship the zip. It's probably not that hard to keep this in, and SeaMonkey probably will want to have the zip available to users in some way (we have a good base of advanced users who demand it).
No longer depends on: 431905
No longer depends on: 431270
Just a personal note

make -f client.mk l10n-checkout does not check out mozilla/other-licenses/7zstub/firefox, which we need for windows
Closing this bug as all this work is done. Remaining work is being tracked in separate bugs.
Status: ASSIGNED → RESOLVED
Last Resolved: 10 years ago
Resolution: --- → FIXED
Product: mozilla.org → Release Engineering