Closed Bug 73256 Opened 23 years ago Closed 23 years ago

[RFE] crunch JS files before packaging milestone releases

Categories

(SeaMonkey :: Build Config, enhancement, P5)

Tracking

(Not tracked)

VERIFIED INVALID
Future

People

(Reporter: dev+mozilla, Assigned: cls)

Details

(Keywords: perf)

For Milestone releases, the JS files that Mozilla uses should be crunched with
the JavaScript Crunchinator (see URL), thus eliminating unnecessary whitespace &
comments. This will increase performance.
Severity: normal → enhancement
Keywords: perf
Summary: crunch JS files before packaging → [RFE] crunch JS files before packaging
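To make the proposal concrete: "crunching" just means stripping comments and collapsing whitespace. Below is a minimal sketch in Python (purely illustrative; the thread later suggests a Perl build script, Python merely stands in). It handles string literals but deliberately skips hard cases like regex literals, and collapsing newlines would break scripts that rely on automatic semicolon insertion.

import sys

def crunch(src):
    """Strip /* */ and // comments and collapse runs of whitespace,
    leaving string literals intact.  A naive sketch: it does not
    understand regex literals, so it is not safe for arbitrary JS."""
    out = []
    i, n = 0, len(src)
    while i < n:
        c = src[i]
        if c in ('"', "'"):                       # copy string literals verbatim
            j = i + 1
            while j < n and src[j] != c:
                j += 2 if src[j] == '\\' else 1   # skip escaped characters
            out.append(src[i:j + 1])
            i = j + 1
        elif src.startswith('/*', i):             # drop block comments
            end = src.find('*/', i + 2)
            i = n if end < 0 else end + 2
        elif src.startswith('//', i):             # drop line comments
            end = src.find('\n', i)
            i = n if end < 0 else end
        elif c.isspace():                         # whitespace run -> one space
            while i < n and src[i].isspace():
                i += 1
            out.append(' ')
        else:
            out.append(c)
            i += 1
    return ''.join(out)

if __name__ == '__main__':
    src = sys.stdin.read()
    crunched = crunch(src)
    sys.stderr.write('%d -> %d bytes\n' % (len(src), len(crunched)))
    sys.stdout.write(crunched)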
See http://www.mozillazine.org/talkback.html?article=1867&message=28#28 for
discussion about this.
See also bug 68045 "precompile chrome JS and load it incrementally". 

Why not make a Perl script that runs before making the jar files? That would save
download time, and the crunched files would load faster and use less memory.
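Something like that Perl pass, sketched in Python for consistency with the snippet above (the staging path is hypothetical; the real build would point it at wherever chrome files are gathered before the jars are zipped up):

import pathlib

def crunch_tree(root):
    """Crunch every .js file under root in place, just before jar
    packaging.  Assumes crunch() from the sketch above is in scope."""
    for path in pathlib.Path(root).rglob('*.js'):
        path.write_text(crunch(path.read_text()))

crunch_tree('dist/bin/chrome')   # hypothetical staging directory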
How badly would this "crunchinator" reduce the readability of the js files?
Brendan, any thoughts?  If it were enabled, it wouldn't be just for milestone
releases.  It would be for all builds (or maybe all non-debug builds).

Priority: -- → P5
Target Milestone: --- → Future
Chris, please forget readability. The Brainjar "crunchinator" makes the source
completely unreadable (after running it, you won't be able to read the source
because it collapses everything into one long string). But start thinking in
kilobytes: I saved 49.5% on a single JavaScript file!
Wow.  That's pretty amazing.  I'm voting for this bug too ;)
Disk is cheap, and some people value readability much more than the savings we'd
forgo by not crunching. I'm not going to make that call -- cc'ing ben, hewitt,
et al. for their thoughts.

I do think the cycle savings will be minor, and I would require someone to show
before-and-after wall-time measurements, quantifying startup, first-window,
etc.  We don't recompile chrome JS for the second window of the same class when
we're brutally sharing.  What's more, I have a bug I'm working on to precompile
JS into a fast-load format on first-run-since-install.  I'm just not convinced
that this crunching is worthwhile, apart from space savings on disk, which are
worth very little.

/be
Asa: what's the target audience of milestone releases?

I think it would be OK if we supported drop-in file preprocessing (which 
included the crunchinator), and perhaps we could have the XPI dists use it, but 
the monolithic builds should not have this.  My theory is that people who 
download monolithic builds are willing to take the extra download time.

The other question I have is: how do crunched files fare after zipping?
Summary: [RFE] crunch JS files before packaging → [RFE] crunch JS files before packaging milestone releases
Would it be possible to have two versions of the file? 
One where the coding is done and readability is important, and a second one
created by taking the full file and crunching it during the build?
The second file should probably go into a separate directory...
Of course one should first see how much time is saved and whether it is worth
the trouble...
just listening in ... adding cc:

also, just wanted to say i crunch my JS code sometimes and have saved more than 
50% in file size (no joke, recently went from 8K --> 4K and 6K --> 2K on some 
linked .JS files) ... now obviously, that's a huge savings while talking about 
downloading code over a dial-up connection ... NOW granted, i don't know what 
that would mean in terms of Mozilla and code sitting on disk and performance ... 

in terms of readability, yeah it sucks, but in the end i just keep a working 
copy and during the build process then i crunch it ..... 
hmmmmmmm .... also, could apply to all .CSS files and maybe any other files 
(xul? -- not an XML guy, don't know) and whatever else where whitespace doesn't 
matter .... ?

And this could be nearly useless; please report numbers for sizes after 
zipping.
timeless:
we are talking about performance, which might be improved if the parser does
not have to wade through known-useless code.
After zipping, the difference between the original and the crunched files should
hardly be noticeable.
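That claim is easy to test with a few lines of Python: deflate a file before and after crunching and compare (the sample filename is hypothetical; crunch() is the sketch from earlier in this bug):

import zlib

def report(name, data):
    packed = zlib.compress(data, 9)            # max deflate, as zip would use
    print('%s: %d bytes raw, %d bytes deflated' % (name, len(data), len(packed)))

original = open('navigator.js', 'rb').read()   # hypothetical sample file
crunched = crunch(original.decode()).encode()  # crunch() from the sketch above

report('original', original)
report('crunched', crunched)

Since comments and whitespace are highly compressible to begin with, the two deflated sizes should come out close.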
yeah, i'm not that concerned with download times ... i'm thinking parser 
performance when loading the thing .... 
Ok, if this is just performance, then let's resolve this as wontfix+useless and 
just focus on bug 68045
yes, I have to agree that the parser really doesn't spend that much time "wading
through known useless code".. 

I think that the crunchinator was designed to make js quick to DOWNLOAD - over a
slow link, dropping 6k to 3k could mean the difference of 1-2 seconds in
displaying a page. We're reading these files right off of disk, I don't think
that parsing the comments/whitespace is really a big deal.
until someone posts numbers on crunched vs. non-crunched builds, this is all
speculation.

The only advantage I've seen so far is in download size (definitely a concern of
Netscape as one of the 5 S's touted in Netscape 6), but since the js will all be
in compressed files already for download that doesn't buy us much.  The real win
for this bug will be if someone can show crunched js runs consistently,
measurably faster than uncrunched.  If it's faster, we can make it happen in the
release builds.
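A rough harness for the kind of numbers being asked for; the binary paths, and the assumption that each build can be made to start, open its first window, and exit, are both hypothetical:

import subprocess
import time

def mean_startup(cmd, trials=5):
    """Average wall-clock time for the command to run to completion."""
    total = 0.0
    for _ in range(trials):
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        total += time.perf_counter() - start
    return total / trials

# hypothetical invocations: wall time approximates startup cost only
# if the build exits right after showing its first window
for label, cmd in [('uncrunched', ['./mozilla-plain/mozilla']),
                   ('crunched', ['./mozilla-crunched/mozilla'])]:
    print('%s: %.2fs' % (label, mean_startup(cmd)))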
ok, you all are the experts on opening files, reading them into memory, etc. so 
whatever can be done to speed it up, great ... 68045 sounds like it's definitely 
got more support until someone can prove that this type of thing would 
contribute something of value ... 
(Only commenting on the _download_ size issue ... we can shrink comm.jar by 
88KB just by stripping out the MPL from the 223 files in comm.jar that have it
(yeah, I actually measured this :-). If done for all jars, this may be ~250KB,
which is a minute of download time for a dialup user. The build folks may wish
to do this although it probably needs a separate bug. (I'm assuming that a 
single license in the top level of a single jar file would be enough to satisfy 
the legal requirement)).
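A sketch of that stripping step, assuming the notice sits in a single leading /* ... */ block that mentions the license by name (which is roughly how the boilerplate appeared):

def strip_license_header(src):
    """Drop a leading block comment if it looks like a license notice."""
    text = src.lstrip()
    if text.startswith('/*'):
        end = text.find('*/')
        if end >= 0 and 'Public License' in text[:end]:
            return text[end + 2:].lstrip('\n')
    return src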
I personally am OK with the idea of stripping out the MPL verbiage in order to 
create the jar file. (I think this came up before in some context -- maybe 
with XUL files?) I see this as no different than stripping out 
license-related comments in compiled C/C++ -- the jar file as distributed (with 
license notices stripped out or compressed) is essentially in "executable" form 
compared to the original source files. I think leaving a single license notice 
in the jar file (for all files) is OK -- presuming (an important point!) that 
all the files in the jar file are in fact under the same license.
we should probably implement about:npl, about:mpl, and about:license first -- i 
don't remember the bug id.

First, I assume we're talking about the notice ("This file is governed by the
terms of the MPL ...") and not the license itself.  There's no question about
the latter, the license itself does not belong here.

I'm generally unhappy with the idea of files that don't have any license notice,
and we can't pull these out for source releases.  But I agree with Frank that it
can be done for the executables.  And timeless is right, we should also make
sure the notice about the MPL is visible and easy to find.

we should file a separate bug on the license removal to avoid confusing it with
JS crunching.
filed bug 73661 for the narrower issue of just stripping the notice, in favour
of a single notice within the '.jar' file.
See also bug 68686, which has numbers on compressed vs. uncompressed .jar files. 
We could save 530K on download by having UN-compressed jars, because when the 
uncompressed .jar is compressed into the .xpi package, zip is able to squeeze 
out the redundant license headers etc., which it isn't able to do when 
XUL/JS/etc. is compressed on a file-by-file basis.

The minor downside is that chrome would take up an extra 3.6 MB on the user's 
disk and, more seriously, could possibly be a lot slower to load since more has 
to be read from disk.
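The effect is easy to reproduce with Python's zipfile module: store the inner jar uncompressed so the outer zip can compress across member boundaries (file names here are hypothetical):

import os
import zipfile

def xpi_size(xpi_name, jar_name, members, inner_mode):
    """Pack members into a jar with the given compression, wrap the
    jar in a deflated .xpi, and return the .xpi size in bytes."""
    with zipfile.ZipFile(jar_name, 'w', inner_mode) as jar:
        for path in members:
            jar.write(path)
    with zipfile.ZipFile(xpi_name, 'w', zipfile.ZIP_DEFLATED) as xpi:
        xpi.write(jar_name)
    return os.path.getsize(xpi_name)

files = ['a.js', 'b.js']   # hypothetical chrome files sharing boilerplate
print('deflated jar:', xpi_size('deflated.xpi', 'comm.jar', files, zipfile.ZIP_DEFLATED))
print('stored jar:', xpi_size('stored.xpi', 'comm.jar', files, zipfile.ZIP_STORED))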
hmmmm ..... did i read that right?

yikes, i'd take a bigger download over larger disk space and longer load time 
any day.  i guess that was the whole point.

I just vote for timeless's suggestion to invalidate this bug in favor of bug
68045. After fixing 68045 (precompiling), the end user of a binary would get the
same files, since comments and whitespace do not influence the compiled code,
and we can avoid a lot of work handling compressed/uncompressed source files.
YES
> We could save 530K on download by having UN-compressed jars because when the 
> uncompressed .jar is compressed into the .xpi package zip is able to squeeze 
> out the redundant license headers etc. which it isn't able to do when 
> XUL/JS/etc is compressed on a file-by-file basis.

> The minor downside is that chrome would take up an extra 3.6 Mb on the user's 
> disk, and more seriously could possibly be a lot slower to load since more has 
> to be read from disk.

But it doesn't have to remain uncompressed on disk, which would offset the slowdown.
Ok, marking INVALID due to bug 68045.
Really doing so this time.
Status: NEW → RESOLVED
Closed: 23 years ago
Resolution: --- → INVALID
verified.
Status: RESOLVED → VERIFIED
Product: Browser → Seamonkey