headless screenshot is too quick: the page hasn't finished loading
Categories: Firefox :: Headless, defect, P3
People: Reporter: rn214, Unassigned
Comment 7•5 years ago (Reporter)
It's still the case in 71.0.
(Btw, for Ubuntu users this bug has become more urgent: with the recent decision to use "snap" packaging for Chromium, the Chromium screenshot feature is broken for all automated testing.)
Please also consider the other use-case where --timeout is needed, for self-tests.
We need to use the browser to run complex UI JavaScript, but we might not actually care about the output screenshot itself, only the fact that, when the page loads, the JS triggers the side effects we want on the server.
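As a rough interim sketch of that use-case (assuming GNU coreutils timeout, a throwaway profile from mktemp, and a placeholder URL), one can keep a headless instance alive for a fixed number of seconds so the page's JS gets a chance to fire its server-side effects, then kill it; no screenshot is involved:
~$ timeout 10 firefox --headless --no-remote -profile "$(mktemp -d)" "https://example.org/selftest"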
Comment 8•4 years ago
(In reply to Richard Neill from comment #7)
> It's still the case in 71.0.
> (Btw, for Ubuntu users this bug has become more urgent: with the recent decision to use "snap" packaging for Chromium, the Chromium screenshot feature is broken for all automated testing.)
snap Chromium also does not support speechSynthesis.getVoices() or speechSynthesis.speak().
Download Chromium from https://download-chromium.appspot.com/. One caveat is that MP4 encoding and decoding are not supported in the default Chromium build (https://bugs.chromium.org/p/chromium/issues/detail?id=601636#c29).
In this case :screenshot at the Web Console (https://developer.mozilla.org/en-US/docs/Tools/Web_Console) and Firefox Screenshots (https://screenshots.firefox.com/) work; only --headless --screenshot consistently hangs and consumes computing power while doing so.
Comment 9•4 years ago
(In reply to guest271314 from comment #8)
> In this case :screenshot at the Web Console (https://developer.mozilla.org/en-US/docs/Tools/Web_Console) and Firefox Screenshots (https://screenshots.firefox.com/) work; only --headless --screenshot consistently hangs and consumes computing power while doing so.
If you don't specify a URL to screenshot you will indeed hang at the moment. See also the observation on bug 1651542 comment 92. The patch on bug 1588152 should fix this shutdown hang hopefully soon.
Comment 10•4 years ago
> If you don't specify a URL to screenshot you will indeed hang at the moment.
A URL is provided. I have been testing with Nightly 83 and MDN pages:
~$ firefox/firefox --screenshot https://developer.mozilla.org/en-US/docs/Tools/Taking_screenshots
*** You are running in headless mode.
(hangs here until CTRL+C)
^CExiting due to channel error.
Exiting due to channel error.
Comment 11•4 years ago
Oh, then it's most likely bug 1563725. Specifying a temporary profile location with -profile should give you a workaround. But that's actually all a bit off-topic.
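A minimal sketch of that workaround (the profile path, output path, and URL here are placeholders):
~$ mkdir -p /tmp/headless-profile
~$ firefox --headless -profile /tmp/headless-profile --screenshot /tmp/page.png https://developer.mozilla.org/en-US/docs/Tools/Taking_screenshots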
Comment 12•4 years ago
(In reply to Henrik Skupin (:whimboo) [⌚️UTC+2] from comment #11)
> Oh, then it's most likely bug 1563725. Specifying a temporary profile location with -profile should give you a workaround. But that's actually all a bit off-topic.
FWIW, I installed Firefox as a .deb to test what was different from using Nightly, as I had previously captured screenshots successfully with a default Firefox install on *nix.
I isolated the issue to "Always ask you where to save files" being set in Preferences instead of "Save files to" </path/to/directory>.
This should be conspicuously documented somewhere in Mozilla publications.
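For reference, a sketch of pre-setting that preference in a dedicated profile via user.js (assuming browser.download.useDownloadDir, browser.download.folderList and browser.download.dir are the prefs behind that UI option; the paths and URL are placeholders):
~$ mkdir -p /tmp/shot-profile
~$ cat > /tmp/shot-profile/user.js <<'EOF'
user_pref("browser.download.useDownloadDir", true);  // "Save files to" instead of "Always ask"
user_pref("browser.download.folderList", 2);         // use the custom directory below
user_pref("browser.download.dir", "/tmp");
EOF
~$ firefox --headless -profile /tmp/shot-profile --screenshot /tmp/page.png https://example.org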
Comment 13•4 years ago
This bug has been open for 4 years.
How about adding a "--delay=<ms>" command-line flag?
Could this be an example of the saying "The perfect is the enemy of the good"? A long time ago, someone wrote "Ideally, however, we would be able to detect page completion without requiring the user to specify a heuristic like a time delay." That sounds complicated. How about a simple flag that causes a simple delay, so that headless screenshots work?
Comment 14•3 years ago (Reporter)
If you control the webserver, you can bodge a timeout by embedding a 1-pixel iframe with display:none, making the iframe contain an image, and having the server delay that image for a couple of seconds. e.g.
<iframe src='http://localhost/delay.php?sleep=5000ms' style='display:none' width=1 height=1></iframe>
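The delay endpoint can be anything that stalls before responding; as a sketch (this delay.cgi, the pixel path, and the 5-second sleep are placeholders standing in for the delay.php above):
#!/bin/sh
# stall the response so the page's load completion is held back, then serve a 1x1 GIF
sleep 5
printf 'Content-Type: image/gif\r\n\r\n'
cat /var/www/html/pixel.gif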
Comment 15•3 years ago
Can confirm this is still reproducible on the latest Firefox Nightly 97, tested on macOS 10.15.
The screenshot is taken before the page has properly loaded. Updating to the current severity rankings.
Comment 16•3 years ago
I can still confirm this for Mozilla Firefox 91.10.0esr on Oracle Linux 8.6.
Without a delay command-line option, the screenshot feature seems to be pretty useless.
Comment 17•2 years ago
Can we please get something done about this? Even if it's as clunky as hard-coding a one second delay, that would be greatly appreciated. As it is now, there are certain webpages that one simply cannot capture, because the capture occurs before the page content has fully loaded. It captures the text and framework of the page, but all responsive content is blank, because the event that triggers such content to start loading is the same one that triggers the screenshot to be taken, thinking that the page has fully loaded.
Comment 18•2 years ago (Reporter)
If it helps, a somewhat clunky but reliable workaround is to use xwd (X window dump).
Here is a way I found of doing it which is reasonably reliable. I use palemoon rather than firefox below, partly because it is not snap-packaged (which means it can run under an account whose homedir is outside of /home, e.g. invoked via apache) and partly because of the need to killall to make it exit - which means that we don't want to kill the desktop firefox. It should work similarly with firefox. All the wrappers are there to make sure that multiple instances run consecutively, never concurrently. There should be no linebreaks in this command; I've added them for legibility...
mkdir /tmp/fake_home_for_this_timeout/ ;
export HOME=/tmp/fake_home_for_this_timeout/ ;
timeout 60 xvfb-run -a -s '-screen 0 1600x1200x24' \
flock -w 50 /tmp/fake_home_for_this_timeout_flockfile \
sh -c "palemoon --new-instance --no-remote --setDefaultBrowser --width 1600 --height 1200 $url &
sleep 6; xwd -root | convert xwd:- output_file.pdf;
killall palemoon;" 2>&1 ;
rm -rf /tmp/fake_home_for_this_timeout/
The key pieces are:
- You need a dummy home, so that firefox/palemoon/chromium/midori don't clutter your real one with prefs and old data.
- flock, so that only one instance runs at a time.
- xvfb-run lets it run in a dummy X session that isn't your own (and sets the virtual screen size).
- Make the browser launch with the same window size as the screen size.
- xwd | convert to screengrab and convert (ImageMagick can produce pdf, jpg, png, etc. according to the extension).
- sleep is a guess, enough to start the browser and load the content.
- Ensure the $url is quoted to protect it from the shell.
It's brutally ugly, but it does work reliably. HTH!
Comment 19•4 months ago
Yep, I confirm that the CLI screenshot functionality is not really reliable on the many webpages that now load dynamically.