Closed Bug 1534483 Opened 6 years ago Closed 6 years ago

Ambiguous zip parsing allows hiding add-on files from linter and reviewers

Categories

(addons.mozilla.org :: Security, defect)

defect
Not set
normal

Tracking

(Not tracked)

VERIFIED FIXED

People

(Reporter: david, Assigned: mat)


Details

(Keywords: reporter-external, sec-moderate, wsec-applogic, Whiteboard: [reporter-external] [web-bounty-form])

Attachments

(2 files, 1 obsolete file)

The add-ons submission and reviewing pipeline makes use of four zip parsers, each of which implements details of the zip format differently:

  • addons-linter uses yauzl (JavaScript)
  • addons-server uses Python's zipfile module
  • autograph's /sign/file endpoint uses Go's archive/zip package
  • Firefox itself uses nsZipArchive

The parsing differences make it possible to construct an ambiguous add-on xpi whose apparent contents depend on which parser is in use. The effect is that you can submit an add-on to addons.mozilla.org that appears benign at review time, but becomes malicious after being signed and made available for download.

The attached file ambiguous_addon_source.zip contains ambiguous_addon.xpi, a proof of concept. Its xpi zip container is specially crafted to present different files to different parsers. It passes addon-linter with no warning or errors. The reviewer tools say the add-on contains only three harmless files. After being signed, the add-on contains additional, malicious files (in the demo they don't actually do anything bad). The "Step-by-step instructions" show how to try the proof of concept against a local installation of addons-server. The Appendix explains how it works, and the attached program make-ambiguous-addon shows how it was made.

Attack scenario

The attack requires an attacker to upload a malicious add-on and get a user to install it. The advantage afforded by the vulnerability is that the attacker can hide the add-on's malicious nature from reviewers, making it more likely to be accepted and signed.

One avenue of attack would be to create a new add-on and convince people to install it. However it would probably be more effective to take advantage of an existing user base. Suppose an attacker compromises the Firefox account of the developer of a popular add-on. Then they could make a trivial update and submit an upgrade, along with arbitrary malicious files. The add-on's existing users would be infected once they upgraded.

The effect of the attack is anything harmful that an extension can do—code execution with extension-level privileges. This could include, for example, running a cryptominer, clicking on ads, or modifying proxy settings to monitor a user's browsing; but would not include, for example, directly modifying the filesystem. In order to use privileged JavaScript APIs, the extension has to declare permissions in its manifest, but the manifest that reviewers see may differ from the manifest that gets signed. That is, a reviewer may see "no permissions," and the add-ons page at addons.mozilla.org may say "no permissions," but on installation the end user will still see a dialog declaring all the permissions the add-on actually has. (I'm not sure how this last point interacts with automatic upgrades.)

It's likely that the attack will be detected if the signed xpi is examined, because autograph re-packs the files into a new zip container and removes all the parsing tricks of the original. The proof of concept makes the malicious nature of the re-packed add-on apparent, but an attacker could of course be more or less artful in disguising it.

The proof of concept needs the autograph /sign/file endpoint to be enabled (as opposed to /sign/data). /sign/data ultimately uses the same Python zip parser as addons-server itself (via signing_clients.apps.JarExtractor), so the files that get signed are the same as the files that get reviewed. Parsing ambiguity still exists, but offhand I couldn't think of a way to make it into as effective an attack. I understand from #9308 that /sign/file is meant to become the default endpoint.

Step-by-step instructions

  1. You start in the role of sysadmin. Check out the source code from https://github.com/mozilla/addons-server. Before starting it, we'll make a couple of small changes:
    • In call_signing in src/olympia/lib/crypto/signing.py, add an or True to the waffle.sample_is_active('activate-autograph-file-signing') condition. This is to force use of the autograph /sign/file endpoint.
    • In _queue in src/olympia/reviewers/views.py, comment out the qs.filter on files.is_webextension. This is to make WebExtensions show up in the reviewer tools. The auto_approve cron job wasn't working for me, and anyway we want to see what a reviewer would see if the add-on were not auto-approved.
  2. Start addons-server according to the install docs. This process was finicky for me. The Appendix has some suggestions of things to try if it's not working.
  3. At the make initialize_docker step, create a superuser that you can remember for later. I used username user and email user@localhost.
  4. Now you are an attacker in the role of add-on developer. Log in with the superuser account. (You could use a different account for each role, but for simplicity I used the superuser for everything.) I didn't know the "right" way to log in, so I did it like this:
    1. Hack wrapper inside login_required in src/olympia/amo/decorators.py, and add the lines:
      from django.contrib.auth import login
      from django.db.models import Q
      from olympia.users.models import UserProfile
      login(request, UserProfile.objects.get(Q(email="user@localhost")))
      
    2. Visit a URL that requires login, like http://olympia.test/en-US/developers/addon/validate.
    3. Remove the hack from src/olympia/amo/decorators.py. (You can't just leave it in, because it will interfere with CSRF tokens.)
    4. Now you are logged in.
  5. Go to http://olympia.test/en-US/developers/ and click "Submit Your First Add-on".
  6. Agree to the Distribution Agreement and Firefox Policies and Rules.
  7. "How to Distribute this Version?" On this site.
  8. Select ambiguous_addon.xpi. Notice it is linted with no errors or warnings. Click "Continue".
  9. "Do You Need to Submit Source Code?" No.
  10. At "Describe Add-on", notice the "good aspect" in the summary. Select any 2 categories and a license. Click "Submit Version".
  11. Now you are a reviewer. Go to http://olympia.test/en-US/reviewers/queue/new and click the new add-on. If you don't see it, go back and check the qs.filter note from step 1.
  12. Notice "Permissions: None". Click "Validation" and notice there are no warnings or errors.
  13. Click "Contents" and notice that there are only 3 files:
    manifest.json
    background.js
    good.html
    
  14. Right-click on "All Platforms" and download ambiguous_add_on-1.0-fx.xpi. Go to about:debugging and "Load Temporary Add-on". Notice the "Being a good add-on" popup. Remove the temporary extension.
  15. Click "Approve", type some text in the Comments box, and click "Save".
  16. Now you are a user. Go to http://olympia.test/en-US/firefox/addon/ambiguous-add-on/. Notice the summary says "good aspect." Notice that no permissions are declared. Right-click "Add to Firefox" and save ambiguous_add_on-1.0-fx.xpi.
  17. Notice the file is signed and contains files that were not visible to the reviewer and would not have passed review.
    manifest.json
    background.js
    bad.html
    bad.js
    badwords.js
    coinhive.js
    snoop.js
    META-INF/cose.manifest
    META-INF/cose.sig
    META-INF/manifest.mf
    META-INF/mozilla.sf
    META-INF/mozilla.rsa
    
  18. If the add-on were signed by the real addons.mozilla.org key, you would be able to install it directly. But it's not, so you have to load it temporarily. Go to about:debugging and load it using "Load Temporary Add-on". Notice the "Being a bad add-on" popup. In the browser console (Ctrl+Shift+J), notice the messages (it's not really doing these things):
    saying bad words...
    mining cryptocoins...
    installing a proxy to monitor all your traffic...
    
  19. Go to about:addons and notice you have installed the "bad aspect."

Remediation

I recommend applying a zip normalization step at the beginning of the pipeline. Before the submitted xpi is sent to addons-linter or anything else, extract and re-pack it using any zip library (doesn't matter which one). Don't use the original submitted file for anything after the normalization step. Normalization will enforce consistent interpretation of the zip container by all steps in the pipeline, modulo other parser bugs.
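
To make the idea concrete, here is a minimal sketch of such a normalization step, assuming Python's zipfile is the chosen library (the in_xpi/out_xpi names are placeholders; a real implementation would also want size limits and error handling):

import zipfile

def normalize_xpi(in_xpi, out_xpi):
    # Read the upload with Python's zipfile and write a fresh archive containing
    # only what that one parser sees; stray central directories, extra local
    # files, and EOCD tricks in the original container are dropped.
    with zipfile.ZipFile(in_xpi) as src, \
         zipfile.ZipFile(out_xpi, "w", zipfile.ZIP_DEFLATED) as dst:
        for info in src.infolist():
            dst.writestr(info.filename, src.read(info.filename))

Every later step (addons-linter, reviewer tools, autograph) would then operate on out_xpi only.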

Consider not allowing the optimized layout format (with a Central Directory at offset 4) when parsing add-on files in nsZipArchive.

You could try re-linting or re-reviewing add-ons (or a random sample) after they have been signed. Maybe you already do this, I don't know.

Thanks

Thanks to @mat on #amo IRC for helping me get a local test environment running.

The WebExtension documentation on MDN is beyond amazing.

Appendix: summary of parsing differences

These are some of the differences in zip parsing logic that give rise to the vulnerability. The yauzl parser, with its strict EOCD comment length check, is the MVP: without it, exploitation of the vulnerability would be much less complicated. You would only need to put two EOCDs at the end of the file: one with a comment length greater than the number of bytes remaining (which Python will accept and Go will ignore), and another, earlier in the file, with a comment length less than or equal to the number of bytes remaining (which Go will accept and Python will never reach). As it is, yauzl puts constraints on the exploit: the files addons-linter sees have to be a subset of the files autograph sees, so that subset cannot be fully malicious on its own.

EOCD search

  • yauzl

    Search backwards for PK\x05\x06. Consider only the first match and assert that the comment length is exactly equal to the number of bytes after the EOCD.

  • Python zipfile

    First, do a special-case check for an EOCD with a zero-length comment at the end of the file. Second, search backwards for PK\x05\x06 in the final 64×1024 bytes and take the first match, without any check on the comment length.

  • nsZipArchive

    First, check for a Central Directory signature near the beginning of the file, at offset 4—this is the optimized jar layout from #559961/#589368 that doesn't have an EOCD. If that fails, then search backwards for PK\x05\x06 and take the first match, without any check on the comment length.

  • Go archive/zip

    Search backwards for PK\x05\x06 in at most the final 65×1024 bytes. Take the first match whose comment length does not exceed the number of bytes after the EOCD.
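
To illustrate the difference, here is a rough sketch of the lenient and the length-checked styles of backward search (my own illustration, not code taken from any of these parsers; yauzl is stricter still, accepting only the very first match and only if the comment length fits exactly):

EOCD_SIG = b"PK\x05\x06"
EOCD_LEN = 22  # fixed-size part of the End of Central Directory record

def find_eocd_lenient(data):
    # Python zipfile / nsZipArchive fallback style: last occurrence wins,
    # comment length is not checked at all.
    pos = data.rfind(EOCD_SIG)
    return pos if pos != -1 else None

def find_eocd_checked(data):
    # Go archive/zip style: last occurrence whose comment length fits within
    # the remaining bytes; otherwise keep scanning backwards.
    pos = data.rfind(EOCD_SIG)
    while pos != -1:
        comment_len = int.from_bytes(data[pos + 20:pos + 22], "little")
        if comment_len <= len(data) - (pos + EOCD_LEN):
            return pos
        pos = data.rfind(EOCD_SIG, 0, pos)
    return None

Two EOCDs placed so that these searches disagree are exactly the trick described above that yauzl's strict check rules out.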

Central Directory size

The EOCD contains fields specifying cd_num_entries, the number of entries in the Central Directory; and cd_size, the size in bytes of the Central Directory.

  • yauzl

    Ignore cd_size and read exactly cd_num_entries entries.

  • Python zipfile

    Ignore cd_num_entries and read exactly cd_size bytes.

  • nsZipArchive

    Ignore both cd_num_entries and cd_size. Continue reading Central Directory Headers as long as the PK\x01\x02 signature is found. Expect a PK\x05\x06 EOCD signature at the end.

  • Go archive/zip

    Ignore cd_size. Continue reading Central Directory Headers as long as the PK\x01\x02 signature is found. Once finished, assert that the number of headers found is equal to cd_num_entries modulo 65536.
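
For illustration, the signature-scanning style used by nsZipArchive and Go looks roughly like this (my own sketch, not the parsers' code); yauzl instead stops after exactly cd_num_entries headers, and Python stops after cd_size bytes:

CDH_SIG = b"PK\x01\x02"
CDH_FIXED_LEN = 46  # fixed-size part of a Central Directory Header

def count_cd_entries_by_signature(data, cd_offset):
    count, pos = 0, cd_offset
    while data[pos:pos + 4] == CDH_SIG:
        # variable-length fields (name, extra, comment) follow the 46 fixed bytes
        name_len = int.from_bytes(data[pos + 28:pos + 30], "little")
        extra_len = int.from_bytes(data[pos + 30:pos + 32], "little")
        comment_len = int.from_bytes(data[pos + 32:pos + 34], "little")
        pos += CDH_FIXED_LEN + name_len + extra_len + comment_len
        count += 1
    return count

With this style, headers appended after the declared end of the Central Directory are visible to a signature-scanning parser like Go but not to yauzl, which is how the proof of concept smuggles its extra entries past the linter.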

Base file offset

  • yauzl

    Takes offsets relative to the start of the file.

  • Python zipfile

    Assumes that the Central Directory always immediately precedes the EOCD, and adjusts the implied start of the file to make the EOCD's specified offset to the Central Directory consistent. For example, if the file is 2000 bytes long, and the EOCD says that the Central Directory is located at offset 800 and has size 200, then the parser will assume that there is garbage at the beginning of the file and the zip actually starts at absolute offset 2000−22−800−200 = 978. It will look for the Central Directory at offset 1778 (i.e., immediately preceding the EOCD), not at offset 800. Similarly the offsets to Local File Headers within the Central Directory will be adjusted. (See the arithmetic sketch after this list.)

  • nsZipArchive

    Takes offsets relative to the start of the file.

  • Go archive/zip

    Takes offsets relative to the start of the file.

Info-ZIP unzip seems to use the Python algorithm initially (assume adjacent Central Directory and EOCD), and warn about how many bytes it is ignoring. But if it doesn't find a Central Directory Header signature there, it falls back to absolute offsets.
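
The worked example from the Python zipfile bullet above, written out as a quick arithmetic sketch:

# 2000-byte file; EOCD (22 bytes, no comment) claims the Central Directory
# is at offset 800 with size 200.
file_size, eocd_len = 2000, 22
cd_offset_in_eocd, cd_size = 800, 200

eocd_start = file_size - eocd_len                      # 1978
assumed_cd_start = eocd_start - cd_size                # 1778: CD assumed adjacent to the EOCD
concat_offset = assumed_cd_start - cd_offset_in_eocd   # 978 bytes of presumed leading garbage

# Python zipfile reads the Central Directory at 1778 and adds concat_offset (978)
# to every Local File Header offset it finds there; parsers that trust absolute
# offsets read the Central Directory at 800 instead.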

Appendix: how the malicious add-on is constructed

The attached source code bundle has tools to show how the proof of concept xpi is parsed by yauzl, Python, and Go.

$ npm install yauzl
$ node ziplist-yauzl.js ambiguous_addon.xpi
446	manifest.json
58	background.js
227	bad.html
$ python3 ziplist-python ambiguous_addon.xpi
328	manifest.json
59	background.js
160	good.html
$ go run ziplist-go.go ambiguous_addon.xpi
446	manifest.json
58	background.js
227	bad.html
415	bad.js
99	badwords.js
131	coinhive.js
66	snoop.js
0	META-INF/manifest.mf
...65531 more copies of META-INF/manifest.mf...
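
The attached ziplist-python tool is not reproduced here, but a minimal equivalent using only the standard library would look something like this:

import sys
import zipfile

# Print size and name for every entry the Python parser sees,
# matching the "size<TAB>name" format shown above.
with zipfile.ZipFile(sys.argv[1]) as z:
    for info in z.infolist():
        print(f"{info.file_size}\t{info.filename}")

The point of running all three listers over the same xpi is that each one reports a different, internally consistent set of files.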

The zip contains three separate Central Directories:

  1. One for nsZipArchive, located at the beginning of the file, using the optimized jar layout that is peculiar to nsZipArchive.
  2. One for Python, located just before the EOCD at the end of the file, where Python will always look for it no matter what offset is given in the EOCD.
  3. One for yauzl and Go, located at the absolute offset given by the EOCD. This could be considered two overlapping Central Directories, because yauzl reads 3 entries from it and Go reads 65536+3 entries from it.

In order to make the number of Central Directory entries come out right modulo 65536, we have to pad the Go Central Directory with dummy Central Directory Headers. There are many padding Central Directory entries, but to save space, they all point to a single Local File. We use the filename "META-INF/manifest.mf" for the padding files, because files with that name are stripped from the input before signing. If we used a different name, autograph would not strip the padding files and would switch to Zip64 format in order to properly represent all 65536+ files, thereby producing a Zip64 file that nsZipArchive cannot parse.
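
The padding count follows from that modulo check; here is the arithmetic, using the entry counts of the attached proof of concept:

cd_num_entries = 3            # declared in the EOCD; yauzl reads exactly this many
real_entries_for_go = 3 + 4   # semi-malicious set plus the fully malicious files
padding = (cd_num_entries - real_entries_for_go) % 2**16   # 65532 dummy headers
total_read_by_go = real_entries_for_go + padding           # 65539 = 65536 + 3
assert total_read_by_go % 2**16 == cd_num_entries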

The nsZipArchive and Python Central Directories could potentially point to different sets of files, but we don't need to make use of that degree of flexibility.

The nsZipArchive optimized jar layout doesn't even look at the EOCD. The EOCD contains a single static value for the Central Directory offset, but it is interpreted differently by Python and yauzl/Go (see "Base file offset" above).

The "semi-malicious files" are not seen by human review and only have to avoid raising linter errors. The "fully benign files" are exposed to human review. The "fully malicious files" are only seen by the end user.

   4 bytes of padding (required for nsZipArchive layout)
   ╭nsZipArchive Central Directory starts here
   │ Central Directory Header "manifest.json" ─────────────────┐
   │ Central Directory Header "background.js" ─────────────────│┐
   │ Central Directory Header "good.html" ─────────────────────││┐
   ╰nsZipArchive Central Directory ends here                   │││
   "PK\x05\x06" (required for nsZipArchive layout)             │││
   ╭semi-malicious files start here                            │││
   │ Local File "manifest.json" <────────────────────┐         │││
   │ Local File "background.js" <────────────────────│┐        │││
   │ Local File "bad.html" <─────────────────────────││┐       │││
   ╰semi-malicious files end here                    │││       │││
   ╭fully malicious files start here                 │││       │││
   │ Local File "bad.js" <──────────────────────┐    │││       │││
   │ Local File "badwords.js" <─────────────────│┐   │││       │││
   │ Local File "coinhive.js" <─────────────────││┐  │││       │││
   │ Local File "snoop.js" <────────────────────│││┐ │││       │││
   ╰fully malicious files end here              ││││ │││       │││
   Local File "META-INF"/manifest.mf" <─────────││││─│││─────┐ │││
╔═>╭╭Go and yauzl Central Directory starts here ││││ │││     │ │││
║  ││ Central Directory Header "manifest.json" ──────┘││     │ │││
║  ││ Central Directory Header "background.js" ───────┘│     │ │││
║  ││ Central Directory Header "bad.html" ─────────────┘     │ │││
║  │╰yauzl Central Directory ends here          ││││         │ │││
║  │ Central Directory Header "bad.js" ─────────┘│││         │ │││
║  │ Central Directory Header "badwords.js" ─────┘││         │ │││
║  │ Central Directory Header "coinhive.js" ──────┘│         │ │││
║  │ Central Directory Header "snoop.js" ──────────┘         │ │││
║  │ 65532 × Central Directory Header "META-INF/manifest.mf" ┘ │││
║  ╰Go Central Directory ends here                             │││
║  ╭fully benign files start here                              │││
║  │ Local File "manifest.json" <──────────────┬───────────────┘││
║  │ Local File "background.js" <──────────────│┬───────────────┘│
║  │ Local File "good.html" <──────────────────││┬───────────────┘
║  ╰fully benign files end here                │││
╠═>╭Python Central Directory starts here       │││
║  │ Central Directory Header "manifest.json" ─┘││
║  │ Central Directory Header "background.js" ──┘│
║  │ Central Directory Header "good.html" ───────┘
║  ╰Python Central Directory ends here
╚══EOCD ambiguously pointing to two Central Directories

Appendix: local addons-server troubleshooting

The addons-server install docs didn't work for me directly. I get the impression, from conversation on IRC, that part of the problem is out-of-date prebuilt Docker containers, so YMMV. Here are some of the things I did to get it working.

  • After running docker-compose up -d, wait a few seconds and run it again. Often the nginx container crashes the first time and needs to be restarted.
  • Run docker-compose exec web bash and then chmod a+rwx /code/storage/files/temp inside the container, to prevent "permission denied" errors on upload.
  • Check logs/docker-celery.log for error messages. If it's hung up, you likely need to restart the worker container and run make initialize_docker again.
  • Run docker-compose exec worker bash and run python manage.py check. If it gives errors about missing modules, run pip install geoip2 django-extended-choices python-magic. Then exit the container and docker-compose restart worker.
    • This is what to do if the initial Running all indexation tasks never finishes. logs/docker-celery.log will have an error about AppRegistryNotReady("Models aren't loaded yet."). You need to install the missing Python modules and restart worker. Then stop the hung make initialize_docker and run docker-compose exec web make populate_data.
  • Repeat the python manage.py check step with web in place of worker.
  • In src/olympia/lib/settings_base.py, change multidb.PinningReplicaRouter to multidb.PinningMasterSlaveRouter and add a new line SLAVE_DATABASES = REPLICA_DATABASES after the definition of REPLICA_DATABASES.
  • If you're having trouble with redirect loops, try rebuilding a fresh addons-frontend image.
    addons-frontend$ docker build . -t addons/addons-frontend:latest
    addons-server$ docker-compose stop addons-frontend
    addons-server$ docker-compose up -d addons-frontend
    
  • For trailing-slash redirect loops, you may have to edit some of the routes. I had to add a /? to the ^(?P<type>fragment|file)/(?P<key>.*)$ route in src/olympia/files/urls.py.
Flags: sec-bounty?

Hi David, thanks for the detailed report and great write up! I will dig into this to confirm in a bit.

+:cgrebs on the AMO team

This is cool. I was able to upload the POC as an internally distributed addon (skipping review; no public distribution) to AMO prod, and:

  • AMO validated it without errors
  • autograph happily signed it
  • unzip on the source xpi just extracts the good files, with a warning:
warning [ambiguous_addon.xpi]:  4326178 extra bytes at beginning or within zipfile
  (attempting to process anyway) 
 extracting: manifest.json           
 extracting: background.js           
 extracting: good.html    

but the signed file includes:

  inflating: bad.html
  inflating: bad.js
  inflating: badwords.js
  inflating: coinhive.js
  inflating: snoop.js

I tried installing the signed addon on Nightly, but get an error:

1552357610214 addons.xpi WARN Invalid XPI: Error: Cannot find id for addon /home/gguthe/Downloads/ambiguous_add_on-1.0-fx.xpi

However, this should be easy to patch in the build script, and bypassing lint and review is already bad.

+:autrilla or :wezhou, can we scale down COSE /sign/file signing to 0%? As David pointed out, that'll mitigate the problem for new signings:

/sign/data ultimately uses the same Python zip parser as addons-server itself (via signing_clients.apps.JarExtractor), so the files that get signed are the same as the files that get reviewed

Status: UNCONFIRMED → NEW
Ever confirmed: true
Flags: needinfo?(wezhou)
Flags: needinfo?(autrilla)
Whiteboard: [reporter-external] [web-bounty-form] [verif?] → [reporter-external] [web-bounty-form]

Thanks david! I was able to sign in stage and load a malicious addon as reported (at https://addons.allizom.org/firefox/downloads/file/710075/ambiguous_add_on-1.0-fx.xpi?src=devhub, but it's only visible to me afaik). Diff against the original source make-ambiguous-addon script:

190a191,196
>     "browser_specific_settings": {
>       "gecko": {
>         "id": "jid1-Kt2kYYgi32zPuw@jetpack.com",
>         "strict_min_version": "57.0"
>       }
>     },
228a235,240
>     "browser_specific_settings": {
>       "gecko": {
>         "id": "screenshots@mozilla.org",
>         "strict_min_version": "57.0"
>       }
>     },

This lets an attacker sign with a different ID than the submitted addon ID, which can increase the apparent legitimacy of the malicious addon e.g. the above addon is shown as "screenshots@mozilla.org" in about:debugging.

The signature files will still have the original IDs since addons-server parses and submits that as the cert CN (so addon takeover and the permissions are intact).

(In reply to Greg Guthe [:g-k] [:gguthe] from comment #2)

I tried installing the signed addon on Nightly, but get an error:

1552357610214 addons.xpi WARN Invalid XPI: Error: Cannot find id for addon /home/gguthe/Downloads/ambiguous_add_on-1.0-fx.xpi

Oops, you're right. I just attached a corrected v1.1 that sets an id in both manifests (same as you did in comment 4). Now I'm able to install the signed addon on Firefox Developer Edition (I set xpinstall.signatures.required=false but of course an AMO-signed file wouldn't need that).

Tried to set the sample to 0 but got an error.

$ ./manage.py waffle_sample activate-autograph-file-signing 0
CommandError: You need to specify a sample name and percentage.

and

$ ./manage.py waffle_sample activate-autograph-file-signing 0.0
usage: manage.py waffle_sample [-h] [--version] [-v {0,1,2,3}]
                               [--settings SETTINGS] [--pythonpath PYTHONPATH]
                               [--traceback] [--no-color] [-l] [--create]
                               [name] [percent]
manage.py waffle_sample: error: argument percent: invalid int value: '0.0'
Flags: needinfo?(wezhou)

(In reply to david from comment #5)

of course an AMO-signed file wouldn't need that).

Yeah, I think AMO would add the ID. I don't have review rights, so I self-signed. Great PoC regardless.

(In reply to :wezhou from comment #6)

Tried to set the sample to 0 but got an error.

Thanks wei, it looks like the management commands parse ints but the DB allows decimals https://github.com/django-waffle/django-waffle/search?q=percent&unscoped_q=percent

:autrilla when you're online can you try to set the waffle sample rate for autograph to 0.0 in the DB?

[root@ip-172-31-31-209 autrilla]# docker exec 5f206e8ea008 ./manage.py waffle_sample -l
Samples:
disco-pane-store-collections: 10.0%
csp-store-reports: 1.0%
paypal-disabled-limit: 10.0%
autosuggest-throttle: 0.0%
activate-autograph-file-signing: 0.0%

Done

Flags: needinfo?(autrilla)
Assignee: nobody → jvehent
Group: websites-security → client-services-security
Component: Other → Security
Keywords: sec-high
Product: Websites → addons.mozilla.org
Group: client-services-security → addons-security

This is the highest quality report I've seen in a long time. Thank you so much for filing it!

I recommend applying a zip normalization step at the beginning of the pipeline.

That's also my recommendation. Can we get a roundtrip through Python zipfile added to the submission process?

I'll file another bug to audit existing addons.

Flags: needinfo?(scolville)
Flags: needinfo?(cgrebs)

Reducing to sec-moderate because exploitation was only possible for a couple of weeks during which we used both Python zipfile and Go's archive/zip in parallel. We'll review extensions signed during this period, but this bug is effectively mitigated by going back to the previous signing method that only uses Python zipfile.

Keywords: sec-high → sec-moderate

Can we get a roundtrip through Python zipfile added to the submission process?

So, reading the very detailed report (thanks!), it seems the "easiest" way to fix all this is for addons-server to unpack and repack uploaded add-ons and only use the repacked XPI file for addons-linter, signing, etc. That may be problematic for bigger uploads, as it will add much more load to the linting process and make it slower.

Do we care about the original upload?

I think if the repack process finds errors, we should save the original upload along with a report for AMO admins to review (just a log line we can alert on). This will help fix linting/repacking issues, but also investigate potential abuses.

David: the ambiguous xpi from comment 3 sends Go's archive/zip into an infinite loop (tested with 1.11.5 and 1.12). Which version of Go did you test this on?

Nevermind, this isn't an infinite loop but 65536 references to manifest.mf. The poc works.

Flags: needinfo?(david)
Assignee: jvehent → mpillard

Our plan is to unpack and repack it in the validation pipeline in addons-server, before we get to the linter.

Flags: needinfo?(scolville)
Flags: needinfo?(cgrebs)
Flags: needinfo?(david)
Group: addons-security → client-services-security
Status: NEW → RESOLVED
Closed: 6 years ago
Resolution: --- → FIXED
Group: client-services-security

This was verified on AMO dev and stage (where COSE signing is still enabled) and on prod, with FF66 on Win10 x64.

Using the sample addon provided, these are the new validation results:

  1. Package uploaded - contents before signing:

good.html
background.js
manifest.json

  2. Package contents after signing:

META-INF
good.html
background.js
manifest.json

Status: RESOLVED → VERIFIED
Flags: sec-bounty? → sec-bounty+

I confirm that tag 2019.03.21 fixes the problem. ambiguous_addon.xpi is normalized to Python's view of the file, and that's what gets signed.

Normalization of ambiguous_addon.xpi results in a benign file, so here's a complementary test case where normalization results in a malicious file. I didn't make this one especially tricky. It just contains two central directories. If unpacked with Python or Info-ZIP, it contains:

manifest.json
coinhive.js

If unpacked with nsZipArchive, yauzl, or Go, it contains:

manifest.json
safe.js

Before the fix, ambiguous_addon2.xpi passes the linter with no warnings. The reviewer tools show coinhive.js, but it isn't highlighted as it would be if the linter had seen it. If signed, the signed add-on will be the benign aspect.

After the fix, the linter raises 3 warnings. The reviewer tools highlight the warnings. If signed, the signed add-on is the same malicious aspect that the rest of the pipeline saw.

Keywords: wsec-applogic
