Closed Bug 129332 Opened 22 years ago Closed 21 years ago

nsIWebBrowserPersist should clean up files after cancel or error

Categories

(Core Graveyard :: File Handling, defect)

Priority: Not set
Severity: major

Tracking

(Not tracked)

Status: RESOLVED FIXED
Target Milestone: mozilla1.5alpha

People

(Reporter: law, Assigned: adamlock)

Attachments

(1 file, 5 obsolete files)

I think the implementation of this interface should be responsible for
deleting incomplete downloads when an error is encountered or the download
is cancelled.

Reasons:

1. It ensures that we only have to write this code once, rather than potentially
in each client.

2. The client does not have sufficient information to fully clean up when the
download is saving a document (with an unknown number of related files, with
unknown names).
Nominating. Currently, if the user cancels a download in progress, or it fails
for any reason, a partial file remains on their disk and, in the case of saving
a complete web page, there may be a large number of files left behind.
Keywords: nsbeta1
Keywords: nsbeta1 → nsbeta1-
Persist could have a flag to enable this behaviour. It does maintain a list of
files to save out to, but it would have to be sure to only clean up those it had
actually created so far, and directory deletion would pose an issue too.
Target Milestone: --- → Future
renominating.
Keywords: nsbeta1- → nsbeta1
Blocks: 99642
QA Contact: sairuh → petersen
Attached patch Work in progress (obsolete) — Splinter Review
Still a work in progress and untested. The patch compiles a list of opened or
created files and deletes them in the event of failure. I still have to add some
logic to only delete empty dirs.
Attached patch Patch (obsolete) — Splinter Review
This patch adds a flag to clean up files when the save operation goes wrong or
is cancelled.

All files created by webbrowserpersist are deleted, followed by any empty dirs
from the data path downwards which are left after the file deletion.
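For illustration, here is a minimal two-pass sketch of that idea, not the code
from the attached patch: it assumes the persist object records every nsIFile it
creates in a cleanup list (the patch calls its list mCleanupList) and that all
entries are local files. The first pass removes files; the second attempts a
non-recursive Remove() on directories, which only succeeds once they are empty.
The real patch instead finds empty directories by walking the data path with
directory enumerators.

#include "nsIFile.h"
#include "nsVoidArray.h"

// Hypothetical helper, not the patch: aCleanupList holds nsIFile* pointers
// to everything the persist object opened or created.
static void CleanupLocalFilesSketch(nsVoidArray &aCleanupList)
{
    // Two passes: the first removes files, the second removes directories
    // that are empty now that their files are gone.
    for (PRInt32 pass = 0; pass < 2; pass++)
    {
        for (PRInt32 i = 0; i < aCleanupList.Count(); i++)
        {
            nsIFile *file = (nsIFile *) aCleanupList.ElementAt(i);
            PRBool exists = PR_FALSE;
            PRBool isDirectory = PR_FALSE;
            file->Exists(&exists);
            if (!exists)
                continue;
            file->IsDirectory(&isDirectory);
            if (pass == 0 && !isDirectory)
            {
                file->Remove(PR_FALSE);   // delete the file
            }
            else if (pass == 1 && isDirectory)
            {
                // Non-recursive remove fails harmlessly if the directory
                // still contains files the persist object did not create.
                file->Remove(PR_FALSE);
            }
        }
    }
}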
Attachment #122881 - Attachment is obsolete: true
Comment on attachment 123061 [details] [diff] [review]
Patch

We should document/comment that this currently only works for local files and
include the bug # for the new bug which will cover "cleanup" of remote
(non-"file:" URL) files.

Fix this comment:
+	 // Two passes, first cleans up files, the second empty directories
Did you mean "then" instead of "the"?

I'm not overly fond of the two passes; can you instead delete the files from
mCleanupList as you handle them?

fix typo in this line:
+		     // there are no files their either.
adt: nsbeta1-
Keywords: nsbeta1 → nsbeta1-
Attached patch Patch v2 (obsolete) — Splinter Review
Updated patch clarifies the description of the flag to indicate it refers to
local files only. I've also changed the comments to make them more descriptive.


The two-pass cleanup is necessary because it makes it easier to test whether
directories can be safely deleted, since all files will have been removed by
the first pass. Otherwise it would require maintaining two separate lists, one
of dirs and another of files, which would probably result in a lot more code
for the same result.
Attachment #123061 - Attachment is obsolete: true
Comment on attachment 123703 [details] [diff] [review]
Patch v2

Requesting r/sr on this patch. This adds a PERSIST_FLAGS_CLEANUP_ON_FAILURE
flag to the persist object which can be used to delete (local) files in the
event of failure. The client side code will still have to be patched to
actually make use of it.
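As a hedged illustration of what that client-side change might look like (not
code from any attached patch; the function and variable names here are
hypothetical, while the contract ID, the persistFlags attribute and the new
PERSIST_FLAGS_CLEANUP_ON_FAILURE constant come from nsIWebBrowserPersist):

#include "nsCOMPtr.h"
#include "nsComponentManagerUtils.h"
#include "nsIWebBrowserPersist.h"

// Hypothetical caller: saves a document and opts in to cleanup-on-failure.
nsresult SaveDocumentWithCleanup(nsIDOMDocument *aDoc, nsIFile *aFile,
                                 nsIFile *aDataPath)
{
    nsCOMPtr<nsIWebBrowserPersist> persist =
        do_CreateInstance("@mozilla.org/embedding/browser/nsWebBrowserPersist;1");
    if (!persist)
        return NS_ERROR_FAILURE;

    PRUint32 flags = 0;
    persist->GetPersistFlags(&flags);
    // Ask the persist object to delete partially written local files (and any
    // now-empty directories) if the save fails or is cancelled.
    persist->SetPersistFlags(flags |
        nsIWebBrowserPersist::PERSIST_FLAGS_CLEANUP_ON_FAILURE);

    return persist->SaveDocument(aDoc, aFile, aDataPath, nsnull, 0, 0);
}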
Attachment #123703 - Flags: superreview?(bz-bugspam)
Attachment #123703 - Flags: review?(brade)
Comment on attachment 123703 [details] [diff] [review]
Patch v2

Remove r/sr pending new patch to address some points discussed via AIM
Attachment #123703 - Flags: superreview?(bz-bugspam)
Attachment #123703 - Flags: review?(brade)
Attached patch Patch v3 (obsolete) — Splinter Review
Patch cleans up the code, removing a couple of unneeded variables and using
nsSupportsArray to hold the stack of enumerator objects, which gets rid of the
DirPos struct and some new/delete operations. I've also moved the cleanup code
into CleanupLocalFiles().

The patch still contains a two-pass loop rather than calling a function twice
with a pass parameter. It just seems to make more sense to keep all the logic
contained inside the function than to pollute the caller for the sake of an
indent.
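A rough sketch of that directory pass, written recursively here for brevity;
the actual patch avoids recursion by keeping its stack of directory enumerators
in the nsSupportsArray mentioned above, and RemoveEmptyDirsSketch is a
hypothetical name, not a function in the patch:

#include "nsCOMPtr.h"
#include "nsIFile.h"
#include "nsISimpleEnumerator.h"

// Depth-first: delete empty subdirectories before trying the parent.
static void RemoveEmptyDirsSketch(nsIFile *aDir)
{
    nsCOMPtr<nsISimpleEnumerator> entries;
    aDir->GetDirectoryEntries(getter_AddRefs(entries));
    if (entries)
    {
        PRBool hasMore = PR_FALSE;
        while (NS_SUCCEEDED(entries->HasMoreElements(&hasMore)) && hasMore)
        {
            nsCOMPtr<nsISupports> supports;
            entries->GetNext(getter_AddRefs(supports));
            nsCOMPtr<nsIFile> child = do_QueryInterface(supports);
            PRBool isDir = PR_FALSE;
            if (child && NS_SUCCEEDED(child->IsDirectory(&isDir)) && isDir)
                RemoveEmptyDirsSketch(child);
        }
    }
    // Non-recursive remove: succeeds only if the directory is now empty,
    // so files the persist object did not create are left alone.
    aDir->Remove(PR_FALSE);
}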
Attachment #123703 - Attachment is obsolete: true
Comment on attachment 123934 [details] [diff] [review]
Patch v3

Can I have an r/sr on this patch please? Thanks
Attachment #123934 - Flags: superreview?(bz-bugspam)
Attachment #123934 - Flags: review?(brade)
Comment on attachment 123934 [details] [diff] [review]
Patch v3

if !exists, shouldn't you remove it from mCleanupList?

Do you really need this block? (I can't tell due to my tiny window):
+		     if (!hasMoreElements)
+		     {
+			 continue;
+		     }

Do you need to build mCleanupList if the flag isn't set?  (Can someone change
the flag in the middle of persisting?)

r=brade
Attachment #123934 - Flags: review?(brade) → review+
1. When !exists I could remove it, but there is a cleanup loop immediately
afterwards so I didn't see the point. Removing an element would require
memmoving the array up.

2. hasMoreElements tests whether the enumerator can be discarded, so yes, it is
required.

3. I could put if statements to skip adding elements to the cleanup list if the
flag is not set. I think my rationale was that we could add a manual cleanup()
function at some point, but I'll put if statements around these additions to
save some memory allocations when the flag is not used.
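Roughly the kind of conditional addition meant in point 3 (a sketch only:
mPersistFlags and mCleanupList follow the names used in this bug, while
localFile is a placeholder for whatever file object has just been created):

if (mPersistFlags & nsIWebBrowserPersist::PERSIST_FLAGS_CLEANUP_ON_FAILURE)
{
    // Only remember the file for later cleanup when the caller asked for it,
    // saving the list allocation in the common case where the flag is unset.
    mCleanupList.AppendElement(localFile);
}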
Target Milestone: Future → mozilla1.5alpha
Comment on attachment 123934 [details] [diff] [review]
Patch v3

Requesting sr on this patch, which adds a flag to the persist object to clean up
files and dirs when persistence fails.

I'll add a conditional test, as Kathy suggested, around the calls that add
cleanup data, to save some mallocs when the flag is not set.
Attachment #123934 - Flags: superreview?(bz-bugspam) → superreview?(jaggernaut)
Attached patch Patch v4 (obsolete) — Splinter Review
Same as before, just with the mentioned if blocks.
Attachment #123934 - Attachment is obsolete: true
Comment on attachment 123934 [details] [diff] [review]
Patch v3

Removing the sr request because the while (hasMoreElements) block looks
redundant and possibly incorrect. I'll put another patch together that removes
it and verify that it is working properly.
Attachment #123934 - Flags: superreview?(jaggernaut)
Attached patch Patch v5Splinter Review
Removed the redundant loop and verified that the code functions correctly, only
deleting dirs when they are empty.
Attachment #124495 - Attachment is obsolete: true
Comment on attachment 124508 [details] [diff] [review]
Patch v5

Requesting sr, new patch fixes the redundant loop issue. Thanks
Attachment #124508 - Flags: superreview?(jaggernaut)
Comment on attachment 124508 [details] [diff] [review]
Patch v5

sr=jag
Attachment #124508 - Flags: superreview?(jaggernaut) → superreview+
Fix is checked in on the trunk. I shall raise a new bug to cover how and when
this flag should be supplied from the client side.
Status: NEW → RESOLVED
Closed: 21 years ago
Resolution: --- → FIXED
I have opened bug 207615 to cover the browser side.
Product: Core → Core Graveyard