Closed Bug 595812 Opened 14 years ago Closed 7 years ago

Implement L20n support

Categories

(L20n :: General, defect)

Type: defect
Priority: Not set
Severity: major

Tracking

(Not tracked)

RESOLVED INCOMPLETE

People

(Reporter: zbraniecki, Assigned: zbraniecki)

References

(Depends on 1 open bug)

Details

(Keywords: meta)

Attachments

(1 file)

This is a tracking bug for L20n support in Gecko.

Everything that has to be done in order to provide full internationalization support in Gecko via L20n should be added as blocking this bug.

The win scenario is when we're able to provide localization of Gecko products exclusively via L20n.
Assignee: nobody → gandalf
Status: NEW → ASSIGNED
Depends on: 595813
Depends on: 595815
Depends on: 595816
Depends on: 595817
Depends on: 595819
Depends on: 595820
Depends on: 595821
Depends on: 595822
Depends on: 595826
Depends on: 611095
Depends on: 611096
Bundle patch that enables L20n support in the Gecko platform, based on patches from bug 566906, bug 595813, bug 595815 and bug 595816.
Depends on: 613064
Depends on: 613065
Depends on: 613072
Blocks: 613939
I looked at the XBL changes; I don't think this is going to fly. Adding yet more uncached JS for XBL to evaluate isn't a good idea. XBL2 fixes the inline-JS behavior of XBL by moving the JS into a separate file. So either these patches have to wait on XBL2, or we need a way to avoid adding more individual JS snippets.

Would it be possible, instead of adding new markup to the XBL files that we ship, to preprocess them as part of L20n repacks?

Perhaps Boris/Jonas have a different opinion on that.
Unfortunately, XBL2 is still some ways out, so I wouldn't want to rely on it unless L20n support is slated for a release far in the future.

The preprocessing idea sounds interesting, but I don't know enough about L20n to tell whether it's feasible.
OK, so the same thing is going on in the XUL code. We should not be evaluating uncached JS snippets in UI code. Perhaps you could add a .l20n property to XUL that collects all of the l20n strings, and have the rest of the code call GetProperty on it to get resolved strings?
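A minimal sketch of what such a collected property might look like (hypothetical names only; document.l20n, getString and resolveL20nEntity are not an existing Gecko API):

// Hypothetical sketch: one per-document object collecting resolved l20n
// strings, so UI code does a plain property lookup instead of evaluating
// an uncached JS snippet per field.
document.l20n = {
  strings: Object.create(null),
  getString: function (id) {
    if (!(id in this.strings)) {
      // resolveL20nEntity stands in for whatever resolves an entity once.
      this.strings[id] = resolveL20nEntity(id);
    }
    return this.strings[id];
  }
};

// Callers then read resolved strings instead of running inline JS:
label.setAttribute("value", document.l20n.getString("bookmarksMenu.label"));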
We can't put the l20n file into the XUL document, as that would break the split between localized and non-localized content, and with it langpacks and multi-locale builds.

So the question is more, how do we get the js cached, if that's essential?
Also note that the evaluation of the js code happens inside the content sink before we put the xul document into the xul cache, AFAICT. Not sure how we cache xbl.

That said, I don't know how we deal with cached documents that would like to use the l20n context to re-evaluate. Gandaf, do you have any tests of that code path?
(In reply to comment #5)
> We can't put the l20n file into the XUL document, as that would break the
> split between localized and non-localized content, and with it langpacks
> and multi-locale builds.

I'm assuming that's a reply to comment 2. That's too bad as it would be the most efficient solution. 

> 
> So the question is more, how do we get the js cached, if that's essential?

As a precondition to efficient caching, we need a way to combine all of the l20n code into a single JS chunk that can be executed once per document (instead of once per field).

Take a look at clients of StartupCache on how the caching works.
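A rough sketch of the single-chunk idea (the buildCombinedScript helper and the snippet format are assumptions for illustration; the real StartupCache API is C++ and is not shown):

// Hypothetical: concatenate the per-field l20n snippets into one script
// that is evaluated once per document and fills a single lookup table.
function buildCombinedScript(snippets) {
  // Each snippet is assumed to be { id, code }, where code is a function
  // body that returns the resolved string.
  var lines = snippets.map(function (s) {
    return "l20nStrings['" + s.id + "'] = (function () { " + s.code + " })();";
  });
  return "var l20nStrings = {};\n" + lines.join("\n");
}

// The combined source is what would be compiled once and cached, instead of
// parsing and executing one tiny snippet per localized field.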

(In reply to comment #6)
> Also note that the evaluation of the js code happens inside the content sink
> before we put the xul document into the xul cache, AFAICT. Not sure how we
> cache xbl.

We don't cache anything in xbl at the moment :(
To summarize my conversation with gandalf:
We should try to make a giant namespaced l20n singleton. This should make l20n a net performance win by:
a) Making the l20n js easier to cache
b) only loading translations once

gandalf will get back to me on whether this is feasible. 

To measure performance differences, we can use FunctionTimers to compare startup overhead against the existing l10n. Normal JS profiling tools should work for the other overheads.
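For illustration, a minimal sketch of the namespaced-singleton idea under discussion (all names hypothetical):

// Hypothetical: one global, namespaced singleton holding all translations,
// so the l20n JS is a single cacheable unit and each resource loads only once.
var L20n = L20n || {};
L20n.translations = L20n.translations || Object.create(null);

L20n.load = function (resourceURI, entities) {
  if (!(resourceURI in L20n.translations)) {
    // Load each resource at most once.
    L20n.translations[resourceURI] = entities;
  }
  return L20n.translations[resourceURI];
};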
I don't think we can do a globally namespaced l20n singleton. That's just not gonna fly with the developers, and it'd preclude us from exposing l20n to extensions, more or less.

What I think we could do is have one global JS singleton that serves as both a cache and a playground for per-document objects in which to evaluate l20n expressions. Something like this, really just sketched:

cache['file:///Apps/omni.jar!/locales/de/browser/chrome/browser.j20n'] = {
  foo: "Hello World"
};
documents['chrome://browser/content'] = {};

Whether we'd have to copy all properties (values or getters) from the cache onto the document object, I don't know; maybe JS proxies can help. Though I'm really not sure how that would impact perf. Also, there may be ways to make the big j20n files be prototypes of smaller ones, if that's useful.
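Sketching the prototype variant mentioned above (hypothetical; whether proxies or prototypes would perform acceptably was an open question):

// Hypothetical: let the per-document object delegate lookups to the cached
// locale resource via the prototype chain instead of copying properties.
var localeResource =
  cache['file:///Apps/omni.jar!/locales/de/browser/chrome/browser.j20n'];

documents['chrome://browser/content'] = Object.create(localeResource);

// Document-specific overrides shadow cached entries without mutating them:
documents['chrome://browser/content'].foo = "Hello from this document";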
Depends on: 704500
Depends on: 704671
Comment on attachment 491362 [details] [diff] [review]
bundle patch (xul,xbl,stringbundle and intl/l20n)

>--- a/content/xbl/src/nsXBLContentSink.cpp
>+++ b/content/xbl/src/nsXBLContentSink.cpp

>+#include "jscntxt.h"

Please don't do that. jscntxt.h is not part of the API.
Keywords: meta
Depends on: 1270140
Depends on: 1270146
No longer depends on: 595826
No longer depends on: 595821
Depends on: 1271825
As we restart the effort to land L20n in Gecko, we're going to use a new tracking bug, migrate the dependencies from this one, and eventually close it.

The new bug is bug 1279002
See Also: → gecko-l20n
Component: Tracking → Localization
Component: Localization → General
Product: Core → L20n
Seven years later, we're making another attempt to refactor our l10n layer.

The new tracking bug is bug 1365426, and I'll mark the previous effort as "INCOMPLETE".
Status: ASSIGNED → RESOLVED
Closed: 7 years ago
Resolution: --- → INCOMPLETE