User Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:41.0) Gecko/20100101 Firefox/41.0
Build ID: 20150918100310

Steps to reproduce:

Use Firefox.

Actual results:

Bookmark backup files (in "bookmarkbackups/") and other files (such as those in "crashes/") are lz4-compressed, but use a non-standard format. As a result, users cannot avail themselves of standard, commonly available tools to inspect these files, which contain *their* data. Instead they have to resort to Firefox-specific (or Mozilla-specific, same point) hacks to access their data, such as using the Library GUI in Firefox to export bookmarks, or using Mozilla's lz4 interfaces through XPCOM.

Expected results:

Mozilla should use a standard file format. You promised to switch to a standard format once one was defined. One was defined a while ago, and standard tools have been available for some time. Why are you still delaying?

https://dxr.mozilla.org/mozilla-central/source/toolkit/components/workerlz4/lz4.js#49
https://github.com/Cyan4973/lz4/blob/master/lz4_Block_format.md
https://github.com/Cyan4973/lz4/blob/master/lz4_Frame_format.md

For example: https://packages.debian.org/jessie/liblz4-tool
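For anyone who wants to inspect their own files in the meantime: judging from lz4.js, the non-standard layout appears to be an 8-byte magic `mozLz40\0`, a little-endian uint32 decompressed size, and then a raw LZ4 block (block format, not the standard frame format). The following is an unofficial stdlib-only sketch of a reader under that assumed layout; the function names are mine, not Mozilla's, and a real tool should use a proper LZ4 library instead of this toy block decoder:

```python
import struct

MOZ_MAGIC = b"mozLz40\0"  # assumed 8-byte magic from lz4.js

def lz4_block_decompress(src: bytes) -> bytes:
    """Minimal LZ4 *block* decoder (no frame header, no checksums)."""
    out = bytearray()
    i, n = 0, len(src)
    while i < n:
        token = src[i]; i += 1
        # High nibble: literal length, extended by 255-bytes if it is 15.
        lit = token >> 4
        if lit == 15:
            while True:
                b = src[i]; i += 1
                lit += b
                if b != 255:
                    break
        out += src[i:i + lit]; i += lit
        if i >= n:
            break  # last sequence carries literals only
        # 2-byte little-endian match offset, then match length (low nibble + 4).
        offset = src[i] | (src[i + 1] << 8); i += 2
        mlen = (token & 0x0F) + 4
        if (token & 0x0F) == 15:
            while True:
                b = src[i]; i += 1
                mlen += b
                if b != 255:
                    break
        # Byte-by-byte copy so overlapping matches (offset < mlen) work.
        pos = len(out) - offset
        for _ in range(mlen):
            out.append(out[pos]); pos += 1
    return bytes(out)

def read_mozlz4(data: bytes) -> bytes:
    """Strip the assumed mozLz4 header and decompress the payload."""
    if not data.startswith(MOZ_MAGIC):
        raise ValueError("not a mozLz4 file")
    (size,) = struct.unpack("<I", data[8:12])
    payload = lz4_block_decompress(data[12:])
    if len(payload) != size:
        raise ValueError("decompressed size mismatch")
    return payload
```

With a standard frame format this whole helper would be unnecessary; `lz4 -d` (from liblz4-tool, linked above) would do the job.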
The lz4.js file has been moved to /toolkit/components/lz4/lz4.js:

http://mxr.mozilla.org/mozilla-central/source/toolkit/components/lz4/lz4.js
Places will use a different format when the platform (toolkit) moves to a different format. And I think it will be the same for all the consumers, so this is a more general Toolkit bug.
(In reply to Marco Bonardo [::mak] from comment #2)
> Places will use a different format, when the platform (toolkit) will move to
> a different format.

To clarify: if the compressor starts creating files in the more widely supported format, and the decompressor can still support the old format (I don't see why not, since there's a header it can detect), then fixing the lz4 component in toolkit will be enough to automatically move all the consumers to the new format.
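The detection step described above is cheap, since both layouts begin with a distinctive magic: the legacy files with `mozLz40\0` (per lz4.js), and standard LZ4 frames with the magic number 0x184D2204 (per the LZ4 frame format spec, stored little-endian). A hypothetical sketch of such a dispatcher (function and label names are mine):

```python
MOZ_MAGIC = b"mozLz40\0"               # assumed legacy Mozilla header
LZ4_FRAME_MAGIC = b"\x04\x22\x4d\x18"  # standard frame magic 0x184D2204, LE

def detect_format(data: bytes) -> str:
    """Tell the legacy mozLz4 layout apart from a standard LZ4 frame."""
    if data.startswith(MOZ_MAGIC):
        return "mozlz4"       # old decoder path, kept for backward compat
    if data.startswith(LZ4_FRAME_MAGIC):
        return "lz4-frame"    # new standard path
    return "unknown"
```

A decompressor structured this way could keep reading old files indefinitely while the compressor only ever writes the standard format.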
I've created an unofficial stand-alone decompressor for `.jsonlz4` files. The project, with source code, is hosted here: https://github.com/avih/dejsonlz4 . The initial v0.1 release can be found at https://github.com/avih/dejsonlz4/releases and includes a Windows executable `dejsonlz4.exe`. It should hopefully compile easily elsewhere too. Please take any discussions regarding this project to the project page on GitHub.
(In reply to Anthony Thyssen from comment #5)
> Could you include in your source code "README.md" file ...

(In reply to Avi Halachmi (:avih) from comment #4)
> Please take any discussions regarding this project to the project page on github.
(In reply to S from comment #7)
> (In reply to Avi Halachmi (:avih) from comment #6)
> > (In reply to Avi Halachmi (:avih) from comment #4)
> > > Please take any discussions regarding this project to the project page on github.
>
> Thanks for the piece of code to decompress. But do you have any idea to go
> the other way and compress?

Please take this to GitHub; it can't be answered here.
The current obfuscation by changing the magic field in the lz4-compressed files doesn't provide any security enhancement. For advanced users it only makes it more complicated to "patch" or "synchronise" configuration files. So +1 for migrating to standard lz4-compressed or uncompressed files.
(In reply to H.-Dirk Schmitt from comment #9)
> The current obfuscation by changing the magic field in the lz4 compressed
> files doesn't provide any security enhancements.

There was no intent to obfuscate, nor any security enhancement, here. At the time lz4 support was added there was no standard format, so a very simple header was created to put in front of the payload. Now a standard exists, but nobody internally has had the time to convert the encoder/decoder to it, nor has anyone volunteered to do that yet.
I would like to support this change with regard to search engines. Currently I'm forced to use side tools, and export/import is severely limited. Also, I don't really see how compression/signing prevents search hijacking when, in my experience, it's mostly done by side software that already has all the tools needed for that.