Trying to update a large binary file with moz-phab submit yields storage error
Categories
(Conduit :: moz-phab, defect, P2)
Tracking
(Not tracked)
People
(Reporter: Pike, Assigned: Kwan)
References
Details
(Keywords: conduit-triaged)
Attachments
(2 files)
When trying to upload icudt67l.dat in https://phabricator.services.mozilla.com/D62732 via moz-phab submit, I hit an error message:

No configured storage engine can store this file. See "Configuring File Storage" in the documentation for information on configuring storage engines.

I tried --less-context, but that didn't help either. Uploading with moz-phab arc diff worked, and said it was uploading 4 chunks. Obviously that works slightly differently, but maybe for this particular patch, that's a good thing.
Assignee
Comment 1 • 4 years ago
So this is just because moz-phab hasn't (yet) implemented uploading "large" (above 8MiB) binary files (understandable, I didn't bother for a while either). Anything above that needs to be split into 4MiB chunks and uploaded via file.uploadchunk.

Here are some notes in case they help adding it to moz-phab:

First, before binary upload, the file.allocate endpoint needs to be called with the file's name, contentLength, and contentHash (the hex digest of SHA-256 hashing the file). The return result will contain the boolean upload and the nullable string filePHID.

If filePHID is non-null and upload is false, Phabricator already has the file and it doesn't need uploading; just bung the PHID in the change metadata (so binary files only need uploading once, except for large ones that require chunking, see bug 1562479).
If filePHID is null and upload is true, upload the file to file.upload as already done.
If filePHID is non-null and upload is true, the file needs chunking.
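The three outcomes above can be sketched in Python (moz-phab itself is written in Python). This is a minimal sketch, not moz-phab's real code: `allocate_params` and `allocate_action` are hypothetical helper names, and the actual Conduit HTTP transport is omitted.

```python
import hashlib

def allocate_params(name, data):
    """Build the parameters for a file.allocate call.

    contentHash is the hex digest of SHA-256 over the file's bytes.
    """
    return {
        "name": name,
        "contentLength": len(data),
        "contentHash": hashlib.sha256(data).hexdigest(),
    }

def allocate_action(result):
    """Map a file.allocate result dict onto the three outcomes."""
    phid = result.get("filePHID")
    if phid is not None and not result["upload"]:
        return "reuse"   # Phabricator already has the file; just record the PHID
    if phid is None and result["upload"]:
        return "upload"  # small file: single file.upload call as already done
    if phid is not None and result["upload"]:
        return "chunk"   # large file: needs chunked upload
    raise ValueError("unexpected file.allocate result: %r" % (result,))
```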
To upload chunks, first query file.querychunks with just the provided filePHID; you'll get back an array of dicts (one for each chunk) with byteStart and byteEnd ints, plus a complete boolean. If complete is true the chunk is already there and nothing needs doing; otherwise the chunk needs uploading.

To upload a chunk, call file.uploadchunk with filePHID, byteStart, data, and dataEncoding. dataEncoding is (currently/last I checked) always the string "base64". data is the base64 encoding of the file's bytes in the range [byteStart, byteEnd).
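The chunk loop described above might look like this in Python. `call` is a stand-in for whatever Conduit helper moz-phab uses (an assumption, not a real moz-phab API): it takes a method name and a parameter dict and returns the decoded result.

```python
import base64

def upload_missing_chunks(call, file_phid, data):
    """Upload every chunk that file.querychunks reports as incomplete.

    `call(method, params)` is a hypothetical Conduit helper.
    """
    chunks = call("file.querychunks", {"filePHID": file_phid})
    for chunk in chunks:
        if chunk["complete"]:
            continue  # Phabricator already has this chunk; skip it
        start = int(chunk["byteStart"])
        end = int(chunk["byteEnd"])
        call("file.uploadchunk", {
            "filePHID": file_phid,
            "byteStart": start,
            # base64 of the file's bytes in the range [byteStart, byteEnd)
            "data": base64.b64encode(data[start:end]).decode("ascii"),
            "dataEncoding": "base64",  # currently always "base64"
        })
```

Querying file.querychunks first (rather than blindly uploading every chunk) also makes interrupted uploads resumable, since completed chunks are skipped.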
unofficial file.allocate docs
unofficial file.querychunks docs
unofficial file.uploadchunk docs
file upload function in phabsend-moz
chunk upload function in phabsend-moz
Updated • 4 years ago
Assignee
Comment 2 • 4 years ago
And just out of curiosity:
$ hg files 'set:binary() and size("> 8MB")'
config/external/icu/data/icudt67l.dat
security/nss/automation/taskcluster/docker-saw/LLVMgold.so.zip
third_party/webkit/PerformanceTests/wasm-godot/godot.wasm
So this is very much a rare case.
Reporter
Comment 3 • 4 years ago
The icudt file is updated frequently, though: https://hg.mozilla.org/mozilla-central/log?rev=external%2Ficu%2Fdata%2Ficudt. It's a bit tricky to follow its history, as it changes its name with each ICU release.
Updated • 4 years ago
Updated • 4 years ago
Assignee
Comment 5 • 4 years ago
file.allocate lets us know whether we can perform a normal upload, as with small
binary files, or if we need to use chunked uploading, as for files over 8MiB.
It also avoids uploading a file multiple times if Phabricator already has it.
Updated • 4 years ago
Assignee
Comment 6 • 4 years ago
Phabricator requires files over 8MiB to be uploaded through a different API that splits them into chunks.
Depends on D81893
Updated • 4 years ago