SDCH (Shared Dictionary Compression over HTTP) might be particularly useful in combination with resource packages (bug 529208). The main complaint about resource packages is that they waste bandwidth: if I modify one file, I have to send the whole package again.
SDCH seems to be not only an interesting feature, but also a very useful one for making the web faster. I'm curious why nobody has cared enough about it to discuss adding it to Firefox. (Maybe I'm wrong about it being a very useful feature?)
I was considering SDCH as a means of web acceleration. Think of a web proxy which provides highly compressed content to clients, even for sites which originally serve uncompressed content. I was puzzled to find Firefox not supporting SDCH. I wondered whether there is something about SDCH preventing its adoption that I'm not aware of. Are its benefits too small to make it an interesting tool for a faster web? Or is it just lack of people's awareness and/or developer man-power? Or what?!
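The core idea behind SDCH (a dictionary shared ahead of time by client and server, so responses only pay for what the dictionary doesn't already cover) can be sketched with zlib's preset-dictionary support. To be clear, this is an analogy, not SDCH itself: SDCH used VCDIFF delta coding, while `zdict` primes DEFLATE's window. The dictionary and page bytes below are invented for illustration:

```python
import zlib

# Invented "shared dictionary": boilerplate that many responses have in common.
shared_dict = (
    b"<html><head><link rel='stylesheet' href='/site.css'></head>"
    b"<body><div id='nav'><a href='/home'>Home</a><a href='/about'>About</a></div>"
)

page = (
    b"<html><head><link rel='stylesheet' href='/site.css'></head>"
    b"<body><div id='nav'><a href='/home'>Home</a><a href='/about'>About</a></div>"
    b"<p>Only this paragraph is unique to the page.</p></body></html>"
)

# Plain DEFLATE, no dictionary.
plain = zlib.compress(page, 9)

# DEFLATE primed with the shared dictionary: bytes already present in the
# dictionary are encoded as back-references instead of literals.
co = zlib.compressobj(level=9, zdict=shared_dict)
primed = co.compress(page) + co.flush()

# The receiver must hold the same dictionary to decompress.
do = zlib.decompressobj(zdict=shared_dict)
assert do.decompress(primed) == page

print(len(plain), len(primed))  # the primed stream is noticeably smaller
```

The design point this illustrates is the same trade-off discussed in the thread: the dictionary transfer is an up-front cost, and the win only materializes once many responses share it.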
My guess is that this is still waiting to see if it ends up being widely used. From what I can tell, Google hasn't rolled it out to all of their services yet (after about 5 years), and server-side support looks sparse. SDCH is a good idea, but it's hard to tell if this would be worth anyone's time yet. It's not impossible that something more efficient (possibly with the dictionary able to be built dynamically as responses are sent, rather than downloaded separately by the client) will come along.
LinkedIn reports some useful experiences using SDCH with Chrome: https://engineering.linkedin.com/shared-dictionary-compression-http-linkedin
(In reply to Patrick McManus [:mcmanus] from comment #5)
> LinkedIn reports some useful experiences using SDCH with Chrome:
> https://engineering.linkedin.com/shared-dictionary-compression-http-linkedin

Those are amazingly high compression ratios (once you have downloaded the dictionary). However, the massive pre-compilation step looks daunting. Has anyone looked at the effect on compression if the dictionary were generated with less extensive pre-compilation, or from a more generic set of tokens?
Can this be closed as wontfix now that we have brotli? https://bugzilla.mozilla.org/show_bug.cgi?id=366559
(In reply to Nick Desaulniers [:\n] from comment #7)
> Can this be closed as wontfix now that we have brotli?
> https://bugzilla.mozilla.org/show_bug.cgi?id=366559

This isn't on the roadmap right now, but I wouldn't be opposed to someone working on it. It's a different use case (i.e. SDCH is best for slightly changing dynamic content).
Another use case is visiting multiple pages of a web site with repeated "framing" HTML that is the same (or almost the same) on every page -- eg. Wikipedia. SDCH would allow this framing HTML to be transmitted once; every subsequent page would then be hugely compressed. Adding SDCH-like capabilities to Brotli (see bug 366559 for that possibility) would also achieve the same goal.
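The repeated-framing scenario can be approximated with zlib's preset dictionary too, by treating the first page fetched as the dictionary for the next one. This is a toy sketch only (SDCH used server-advertised dictionaries with VCDIFF coding, not a previous response as a dictionary), and the page content is synthetic, generated deterministically just for this demo:

```python
import hashlib
import zlib

# Synthetic, hard-to-compress "framing" markup shared by every page.
framing = b"".join(
    hashlib.sha256(i.to_bytes(2, "big")).hexdigest().encode() for i in range(40)
)

page1 = b"<html>" + framing + b"<p>First article body.</p></html>"
page2 = b"<html>" + framing + b"<p>Second article body.</p></html>"

# Baseline: the second page compressed on its own still pays for the framing.
baseline = zlib.compress(page2, 9)

# Using the already-transferred first page as a dictionary, only the parts of
# page2 that differ from page1 cost real bytes on the wire.
co = zlib.compressobj(level=9, zdict=page1)
delta = co.compress(page2) + co.flush()

do = zlib.decompressobj(zdict=page1)
assert do.decompress(delta) == page2

print(len(baseline), len(delta))  # delta is a small fraction of baseline
```

This mirrors the Wikipedia example above: once the framing has been transmitted once, each subsequent page effectively transmits only its unique body.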
Has Regression Range: --- → irrelevant
Has STR: --- → irrelevant
Summary: Implement SDCH → Implement support for SDCH (Shared Dictionary Compression for HTTP)
Chromium: Intent to Unship: SDCH
https://groups.google.com/a/chromium.org/forum/#!topic/blink-dev/nQl0ORHy7sw
"so we are on track for disabling SDCH in M59."
Bye, bye, SDCH. :)
If there is standards activity here, we can reconsider.
Status: NEW → RESOLVED
Last Resolved: 2 years ago
Resolution: --- → WONTFIX