Closed Bug 1883642 Opened 2 months ago Closed 1 month ago

Add disk size measurements for places.sqlite and favicons.sqlite


(Firefox :: Profile Backup, task, P1)




126 Branch
Tracking Status
firefox126 --- fixed


(Reporter: mconley, Assigned: kpatenio)



(Whiteboard: [fidefe-device-migration])


(2 files, 2 obsolete files)

Within BackupService, on initialization, we'll want to use IOUtils.stat to get the file size of places.sqlite and favicons.sqlite in the profile directory.

We'll want to add these as Glean scalar probes. This will be the first set to land, so we'll need to do the following:

  1. Add a metrics.yaml to browser/components/backup. The migration probes can be used as a general template.
  2. Add a namespace, browser.backup, and two probes within it: one for places_size and one for favicons_size. These should be of type quantity.
  3. Add a new async private method within the BackupService class for collecting measurements, and call that from the constructor.
  4. Have the private method use IOUtils.stat to get the file size of places.sqlite and favicons.sqlite. The stat Promise will resolve with the file size in bytes, or -1 if the file size can't be read. If it's -1, just record it as 0. Otherwise, round it to the nearest size in kilobytes, and record that.
  5. Add a new directory called tests under browser/components/backup and create an xpcshell folder inside it. Within that folder, place an xpcshell.toml file that has these two lines:
  6. After running ./mach build, add a new test using ./mach addtest browser/components/backup/tests/xpcshell/test_resource_measurements.js
  7. Inside that xpcshell test, use an add_setup() that calls do_get_profile() and Services.fog.initializeFOG(), like this:
  8. Import BackupService, and then write a test using add_task that constructs a new BackupService instance and checks that the two probes have values > 0, kinda like this:
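The rounding rule in step 4 can be sketched as follows. The helper name bytesToKilobytes and the #takeMeasurements method in the comment are illustrative assumptions, not the final BackupService API:

```javascript
// Hypothetical helper mirroring step 4: IOUtils.stat resolves with the file
// size in bytes, or -1 if the size can't be read. Record 0 on failure,
// otherwise the size rounded to the nearest kilobyte.
function bytesToKilobytes(sizeInBytes) {
  if (sizeInBytes < 0) {
    return 0; // the file size couldn't be read
  }
  return Math.round(sizeInBytes / 1024);
}

// Inside BackupService, the private method might then look roughly like
// (Firefox-only APIs shown in comments, since they only exist in chrome code):
//
//   async #takeMeasurements() {
//     let placesPath = PathUtils.join(PathUtils.profileDir, "places.sqlite");
//     let { size } = await IOUtils.stat(placesPath);
//     Glean.browserBackup.placesSize.set(bytesToKilobytes(size));
//     // ...and the same for favicons.sqlite.
//   }
```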

It's possible that do_get_profile won't actually do the work to create a Places and/or Favicons database, and the sizes of the files in the test will be 0. If so, there are probably some small tweaks we can make to add_setup to ensure that those databases exist.
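For steps 7 and 8, the setup and test skeleton might look roughly like the following. This is Firefox chrome code that only runs inside the xpcshell harness, and the module path and the Glean accessor names are assumptions derived from the metric names in step 2:

```js
const { BackupService } = ChromeUtils.importESModule(
  "resource:///modules/backup/BackupService.sys.mjs"
);

add_setup(() => {
  // The measurements need a profile directory to stat, and FOG must be
  // initialized before Glean values can be recorded or read back.
  do_get_profile();
  Services.fog.initializeFOG();
});

add_task(async function test_resource_measurements() {
  new BackupService();
  // The constructor kicks off measurement asynchronously, so the real test
  // may need to await some signal that measuring finished before reading
  // the probes back.
  Assert.greater(Glean.browserBackup.placesSize.testGetValue(), 0);
  Assert.greater(Glean.browserBackup.faviconsSize.testGetValue(), 0);
});
```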

Blocks: 1883655

And of course, don't forget to create the data review request. Maybe do that after step 2 so that you can work on steps 3-8 while the data review is underway. ./mach data-review can generate the request form for you!

Blocks: 1883736
Blocks: 1883739
Blocks: 1883740
Blocks: 1883747
Assignee: nobody → kpatenio
Severity: -- → N/A
Priority: -- → P1

Adds a BackupResource abstract class to be extended by more specific resource handlers, and a BackupResources module that imports them.

All imported resources will be provided to the BackupService constructor to instantiate.
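The shape described above might look roughly like this sketch; the member names are illustrative assumptions rather than the patch's exact interface:

```javascript
// Hypothetical base class: subclasses supply a key identifying the resource
// they handle and a measure() implementation for recording disk sizes.
class BackupResource {
  static get key() {
    throw new Error("BackupResource::key must be overridden");
  }

  async measure(profilePath) {
    throw new Error("BackupResource::measure must be overridden");
  }
}

// Example of a more specific resource handler extending the base class.
class PlacesBackupResource extends BackupResource {
  static get key() {
    return "places";
  }

  async measure(profilePath) {
    // The real implementation would stat places.sqlite under profilePath.
    return 0;
  }
}
```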

Attachment #9390308 - Attachment description: WIP: Bug 1883642 - Add BackupResource abstract class. → Bug 1883642 - Add BackupResource abstract class. r=mconley,kpatenio

Comment on attachment 9390308 [details]
Bug 1883642 - Add BackupResource abstract class. r=mconley,kpatenio

Revision D203795 was moved to bug 1884995. Setting attachment 9390308 [details] to obsolete.

Attachment #9390308 - Attachment is obsolete: true
Depends on: 1884995

Need info'ing myself as a personal reminder for data review

Flags: needinfo?(kpatenio)
Attached file Data collection request (obsolete) —
Flags: needinfo?(kpatenio)
Attachment #9391092 - Flags: data-review?(jhirsch)
Comment on attachment 9391092 [details]
Data collection request

Whoops, I meant to update the file for data review request, not make a new comment. Sorry for any confusion!
Attachment #9391092 - Attachment is obsolete: true
Attachment #9391092 - Flags: data-review?(jhirsch)
Attachment #9391133 - Flags: data-review?(jhirsch)

Comment on attachment 9391133 [details]
Data collection request


Just a suggestion: while this technically falls under category 1 data, I'd encourage introducing some level of fuzzing to reduce the potential for user fingerprinting. That could mean either a histogram that buckets the data at some level finer-grained than MB, but without full visibility down to the individual kilobyte; or rounding the data off to some acceptable error level, as was done in bug 1884407.
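A minimal sketch of the suggested rounding, assuming a 10 KB granularity (the actual bucket size would be settled during data review):

```javascript
// Round byte counts to 10 KB increments so the recorded value stays
// finer-grained than MB but can't identify a user down to the kilobyte.
// BUCKET_KB is an assumed granularity, not a reviewed value.
const BUCKET_KB = 10;

function fuzzedSizeKilobytes(sizeInBytes) {
  if (sizeInBytes < 0) {
    return 0; // the size couldn't be read
  }
  return Math.round(sizeInBytes / 1024 / BUCKET_KB) * BUCKET_KB;
}
```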

  1. Is there or will there be documentation that describes the schema for the ultimate data set in a public, complete, and accurate way?

Yes, the usual Glean data dictionary.

  2. Is there a control mechanism that allows the user to turn the data collection on and off?

Yes, the usual Firefox data submission opt-out controls.

  3. If the request is for permanent data collection, is there someone who will monitor the data over time?

Yes, mconley will monitor.

  4. Using the category system of data types on the Mozilla wiki, what collection type of data do the requested measurements fall under?

Category 1, technical data.

  5. Is the data collection request for default-on or default-off?


  6. Does the instrumentation include the addition of any new identifiers (whether anonymous or otherwise; e.g., username, random IDs, etc. See the appendix for more details)?


  7. Is the data collection covered by the existing Firefox privacy notice?


  8. Does the data collection use a third-party collection tool?


Attachment #9391133 - Flags: data-review?(jhirsch) → data-review+
Pushed by
Add disk size measurements for places.sqlite and favicons.sqlite. r=backup-reviewers,places-reviewers,mconley,mak
Closed: 1 month ago
Resolution: --- → FIXED
Target Milestone: --- → 126 Branch
Blocks: 1887724