The goal of this bug is to investigate the memory consumption issue related to the use of multiple processes, and to identify whether there is a possibility for sharing data across processes. Designing a sharing mechanism is likely to be complex and to require a significant time investment. Thus, before taking such a decision, we should gather solid evidence about the properties of the data that we could share between processes.

Among the questions to answer:

1. How much data is shareable?
   a. Is it worth investing in such a mechanism?
2. What kinds of data would be shared: strings, script data, Shapes, Objects, frozen objects?
   a. What are the types of data?
   b. Do they contain pointers, or only offsets?
   c. Are they mutable? Is the shared state transient?
3. How persistent is the shareable data, and for how long?
   a. Do we need to collect it?
   b. Can we keep it as long as processes are running?
   c. Does the amount of data increase?
   d. Can the amount of data decrease?
   e. How large can the amount of shareable data become?

As well as how this potentially shared data could be manipulated:

4. Who is responsible for sharing?
   a. Do we need a locking mechanism?
   b. Do we need a reference counting mechanism?
   c. Garbage collection of objects?
   d. Process communication?
   e. What happens in case of crashes?
   f. Which strategy do we use: create-and-deduplicate, or request-or-create?
5. What is the audience?
   a. Would we share chrome/jsm data?
   b. Would we share content data?
   c. Do we need to enforce security dynamically?

PS: This is a small sample of the questions that come to my mind, but answering them seems necessary to properly design such a sharing mechanism for something as complex as JS.

http://www.erahm.org/2016/02/11/memory-usage-of-firefox-with-e10s-enabled/

https://groups.google.com/d/topic/mozilla.dev.platform/gJzagkAu4Ks/discussion
At the moment I don't have anything to add beyond what I wrote in .