User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2) Gecko/20100115 Firefox/3.6 Build Identifier: 3.0.1 final In a corporate roaming environment, the IMAP offline store can get big, so it needs to be stored in a 'Local' location that doesn't get synced to the server. The problem is that the message pane folder/column view settings are stored in the .msf files, which live in the same folder as the offline store. These settings that control the message pane folder/column views need to be abstracted out somehow so they can be included in the 'Roaming' part of a profile, separate from the offline store. Maybe in an SQLite database? Or maybe they could be stored directly on the IMAP server itself (I know it's possible to save 'files' to an IMAP server; that's how SyncKolab works)... Reproducible: Always
Summary: Separate the message pane folder view/column settings from IMAP offline store → Separate the message pane folder view/column settings from the .msf files/message store
This is unlikely to happen as part of core Thunderbird development anytime soon. The most likely route for it, or something like it, to happen (possibly as an extension) would be Mozilla Weave integration. It would not be that hard for an interested extension to do. If someone gets the scaffolding set up, feel free to ask questions on mozilla.dev.apps.thunderbird about how to do so. I agree that the overarching problem of roaming profiles is not well addressed by Thunderbird.
Would it be totally out in left field to suggest that maybe these kinds of settings could be stored on the IMAP server itself? I know the Kolab extension allows you to store things like Contacts on an IMAP server...
Conceptually speaking it's not a bad idea. From an implementation perspective, it would be a serious undertaking. I'm not aware of any code in the Thunderbird codebase that deals with storing data as fake IMAP messages on a server. The Weave case is at least an order of magnitude easier: it's an existing, working mechanism with explicit, tested semantics and communication mechanisms, and it comes with an API. If one were wedded to storage via IMAP, it would likely make the most sense to have Thunderbird use Weave, with a Weave back-end plugin that uses IMAP folders to synchronize. All of this, again, can be done as an extension, which is likely the only way such a thing has a chance of being developed.
(In reply to comment #3) Yeah, I didn't think it would be trivial... > The weave case is at least an order of magnitude easier. It's an existing, > working mechanism with explicit and tested semantics and communication > mechanisms that comes with an API. But I'd prefer something where I'm in control - no offense, but I don't want to store my stuff on Mozilla's Weave system, and my understanding is that setting up my own Weave server is still far from trivial... Anyway, thanks for the comments... I still miss the good old days when you could store your entire profile on an LDAP server with Netscape 4...
Status: UNCONFIRMED → NEW
Ever confirmed: true
Hardware: x86 → All
Version: unspecified → Trunk
(In reply to Andrew Sutherland (:asuth) from comment #3) > Conceptually speaking it's not a bad idea. From an implementation > perspective, it would be a serious undertaking. OK, how about making it possible to store the .msf files in the %AppData% folder, separate from the other files/folders?
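Absent such an option in Thunderbird itself, the split could in principle be approximated outside the application: move each .msf index into the roaming part of the profile and leave a link behind at its original path so Thunderbird still finds it. The sketch below is a hypothetical workaround, not a supported Thunderbird feature; the function name and directory layout are assumptions, and on Windows creating the links requires symlink privileges (or NTFS junctions via `mklink`).

```python
import shutil
from pathlib import Path

def relocate_msf_files(mail_dir: Path, roaming_dir: Path) -> list[Path]:
    """Move every .msf index out of the local (unsynced) mail directory
    into a roaming directory, leaving a symlink at the original path.

    Hypothetical workaround sketch only -- Thunderbird does not provide
    or document this behavior.
    """
    roaming_dir.mkdir(parents=True, exist_ok=True)
    moved = []
    for msf in sorted(mail_dir.rglob("*.msf")):
        if msf.is_symlink():
            continue  # already relocated on a previous run
        target = roaming_dir / msf.relative_to(mail_dir)
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.move(str(msf), str(target))  # move the index to roaming storage
        msf.symlink_to(target)              # Thunderbird still sees the old path
        moved.append(target)
    return moved
```

One caveat with this approach: Thunderbird rewrites .msf files frequently, and depending on how it replaces them (in-place write vs. rename-over), a rewrite could silently replace the symlink with a fresh local file, so the links would need periodic re-checking.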
Summary: Separate the message pane folder view/column settings from the .msf files/message store → Separate the thread pane view/column settings from the .msf files/message store
Summary: Separate the thread pane view/column settings from the .msf files/message store → Separate the Thread Pane view/column settings from the .msf files/message store
Marking as a dupe of bug 572044, which covers this request as well as other metadata that should be non-volatile but is stored in the volatile .msf files...
Status: NEW → RESOLVED
Last Resolved: 2 years ago
Resolution: --- → DUPLICATE
Duplicate of bug: 572044