Closed Bug 507795 Opened 15 years ago Closed 5 months ago

Default font sizes should be in 'points', not pixels due to varying DPI

Categories

(Thunderbird :: Preferences, defect)


Tracking

(Not tracked)

RESOLVED WONTFIX

People

(Reporter: mozilla, Unassigned)

References

Details

User-Agent:       Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.1) Gecko/20090715 Firefox/3.5.1 (.NET CLR 3.5.30729)
Build Identifier: version 2.0.0.22 (20090605)

This is a long-outstanding bug that causes display problems across all browsers and different systems.  Fonts are sized in *points*.
A 10-point font = Pica = 10 cpi; Elite (12 cpi) gives you about an 8pt font.

When specifying default fonts under Options -> Display -> Formatting (Fonts...), you are asked
for a size in pixels.  This is a serious design flaw.  A normal user will have no idea how to answer here.  Instead, one *assumes* the system DPI is set correctly for their screen (you have to start with the basics).  People complain about setting the DPI too high because code that *tries* to look correct by using point sizes gets reduced way too small.

Bad MS-DOS-style programs that size text in pixels have to set their values extraordinarily high for the text to be readable.  But only point sizing can be the same across different devices.

Please STRONGLY consider fixing this BUG ASAP -- and allow the user to enter
the size of their desired font in points (in fact, suggest that they set their DPI accurately and then set the point size -- then they, and those they communicate with, will get the same size text and the same readability no matter what monitor they use -- which is really the point: effective communication).

I do not consider this an RFE, as it is a design flaw and never should have been implemented this way.

Pixels may be good for local programs running on known hardware, but as soon as you have varying screens, with text read across multiple hardware platforms, you need to go with standards.



Reproducible: Always
Confirming as the Thunderbird equivalent of SeaMonkey bug 186718, and splitting this off from bug 469303 as a separate topic (also see the supporting discussion there).
Blocks: 452711
Status: UNCONFIRMED → NEW
Component: General → Preferences
Ever confirmed: true
OS: Windows XP → All
QA Contact: general → preferences
Hardware: x86 → All
Version: unspecified → Trunk
So basically wontfix it would seem.
I don't think so.

Quoting Bryan in bug 469303 comment #2:
> it would be great to move to points instead of pixels

Users are more familiar with points than pixels and we should have the means
to do the conversion (known dpi).
Why wouldn't you fix such a design flaw?

If a user moves a T-bird (or FF) profile to another computer that has a different DPI, and the DPI is set correctly on both, wouldn't values stored in points produce fonts of the same size on each computer?

Wouldn't this make FF/TB less friendly?  More techno-nerdish?
I don't claim any deep knowledge of this, but if I read the discussion in bug 186718 correctly it's more complex than that.

Re users' familiarity with px vs. pt, I doubt there's a significant difference.
Re users' familiarity with px vs. pt:

Points can be converted to SI units.

They are a physical measurement with a constant value across all monitors.

Pixels are not convertible to SI units.

See: http://www.onlineunitconversion.com/length_all.html
and put in "points (US)" and output "millimeters" (or inches... whatever).

There are 4 listed definitions for points -- one used in France, one for the US/UK,
one used by LaTeX and one used by Adobe.

All but the French one come out close to 72 per inch.  (Adobe's is exactly 72 per inch, which
would be fine with me.)

If you want -- w/SI -- some users in *some* countries will know what you mean
when you specify characters in fractions of a mm.

But I'd recommend the interface allow the user to switch to specifying at least in points (using 72 per inch for the value), and, optionally, in mm (or microns if you want to use non-fractional numbers)...

Users may know how many pixels their monitor is (though my parents don't, I know that for a fact), but that won't tell them what size a font will display at on their monitor AND on their parents' (assuming they don't have their parents' monitor DPI memorized... :-))...

That's the benefit of using points -- they are not dependent on your monitor.
They have a physical, real-world, fixed size that is constant, regardless of monitor.
I wish this could be given more importance.

Pixels are "relative widths" -- and won't ever allow web pages to be designed for a wide range of audiences.

Points have a defined meaning in CSS of 1/72 of an inch.

Legal CSS units for absolute lengths are points (pt), picas (pc), inches (in),
centimeters (cm) and millimeters (mm).

For font sizes, points are most commonly used in the US.

For paper, either cm or in; I'd vote for cm as a standard, but why not
make it configurable -- for block-level items, default to cm or mm,
and for font sizes let users choose points or mm, so 12pt type would be
about the same as 4mm type (4.233mm).
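
For concreteness, a minimal CSS sketch of how those absolute units line up (the class names are just illustrative; the arithmetic assumes CSS's definitions of 1in = 72pt = 25.4mm):

/* All three declare (essentially) the same physical size under CSS absolute lengths */
.sample-pt { font-size: 12pt; }      /* 12/72 in = 1/6 in                       */
.sample-mm { font-size: 4.233mm; }   /* 12pt * 25.4mm / 72pt = 4.233mm          */
.sample-in { font-size: 0.1667in; }  /* the same 1/6 in, rounded, in inches     */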

This needs to be fixed in ffox & tbird..

-linda
At least add a unit to the preference interface so that users know that the value is in pixels rather than points. A user may set the plain-text font to a default of 10 and the minimum font size to 9, thinking these are points rather than pixels. For many people, this can make text difficult if not impossible to read.

When setting pixels per inch, a common value on many computers is 96, but this can range between 72 and 128 on some displays.
I would suggest closing this since an enormous number of websites have broken layout when fonts do not have the expected _pixel_ sizes. Using page zoom and putting up with pixellated raster graphics seems like the best option available at the moment. (Follow-up: calculate a suitable zoom factor based on the computer's DPI setting, since 100% with 100 DPI fonts is totally unusable on 200+ DPI screens.)
Gecko always uses 96dpi now, so the only difference between px and pt is a constant scalar factor. We should probably resolve this bug as INVALID now, due to the Gecko changes.
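
To spell out that constant factor (a sketch only, assuming the fixed anchors of 96 px per inch and 72 pt per inch; the selectors are hypothetical):

/* With px at 96/in and pt at 72/in, 1pt = 96/72 px = 4/3 px */
.size-in-points { font-size: 12pt; }  /* computes to 12 * 4/3 = 16px */
.size-in-pixels { font-size: 16px; }  /* the same computed size      */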
The point was to be able to have font sizes look constant across displays.

If you think that problem is solved, sure, close it.  From the post just before yours, it doesn't sound like it.

A dot is no longer a physical measurement related to screen resolution -- it is supposed to be 1/96 of an inch (in HTML5). The question is, how do I tell anything (browser, OS, etc.) what an "inch" is? Windows had (maybe still does) a display of 2 lines and asked you
to measure how far apart they were -- then it could calculate the "zoom" factor
mozilla @ dhardy mentioned.

The only places that get it right now are Windows and my X display -- Windows because it lets me enter a physical value or zoom, and X because when my X server starts it reads the Windows value out of the registry.

Thus a 12pt font on Windows looks the same size as a 12pt font displayed in vi (within roundoff errors).  Specifically, gvim running natively with a given font size will look the same as gvim running over X on a remote host.

What I do on FF/TB now is set layout.css.devPixelsPerPx -- which works for Windows, but not on Linux with X, as X has its own idea of DPI (usually set when the server starts, though I've seen it reset while the server is still up too -- not sure how that is done yet).

Likely 2 different solutions (at least) will be needed for Windows vs. X, since X already has
a DPI setting that scales up text and widgets.  Windows has always been the problem area.
But devPixelsPerPx -- maybe expose that value on Windows? It's the ratio of real device pixels to the "96dpi"-sized CSS pixels...
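
As a rough CSS-side illustration of that ratio (an analogy only, not the pref itself; the selector and icon path are hypothetical), CSS can observe device pixels per CSS pixel via the resolution media feature:

/* 1dppx = one device pixel per CSS ("96dpi") pixel; HiDPI screens report 1.5dppx, 2dppx, ... */
@media (min-resolution: 2dppx) {
  /* at least 2 device pixels per CSS pixel: use a sharper icon */
  .toolbar-icon { background-image: url("icon@2x.png"); }
}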
Severity: major → normal

What do you think, close per comment 10?

Flags: needinfo?(alessandro)
See Also: → 918063

I think so. 8+ years ago, a thing like HiDPI wasn't on the radar. Apps have updated or adjusted to higher DPI / 4K/ 8K since then.

It wasn't about 'HiDPI', but about the fact that font sizes cannot portably be specified in pixels (how many pixels per inch are there on a printed page?). Despite Gecko's adoption of the HTML5/CSS standard of 96 px per inch, that is not a portable standard for font specification that would be independent of
OS, GUI library, device, media, etc. It's also a risky adoption, since the still widely used X display on Linux + Android uses dots-per-inch values that scale with the display (and not 96dpi). This will be more evident on display systems that support separate DPI values for different
displays -- i.e. if you have one display at 96dpi, and a handheld that operates as a separate display/controller for computer functions using a 200-300 dpi display, you are likely to get widely different text sizes if you rely on the Windows/CSS value of 96dpi being constant.

Fonts in publishing apps on computers (Adobe et al.) use points (Adobe fixes their size at 72 points per inch), each about 4/3 the size
of the pixels used on Windows or defined by CSS (which anchors pixels at 96 per inch). But that's not portable to other types of displays
(Retina-based, X-windows based, printed paper). Specifying font sizes in publishing apps, where what you see on the screen should look
the same when printed on paper (WYSIWYG), is most often done in points.

That's still the case, because a pixel still has a different meaning on different devices, OSes, and output media. A unit like the point is
based on a fixed size of ~72 per inch. There is no widely accepted metric standard; however, allowing mm values in steps of 0.1mm would
likely be satisfactory for German and Japanese standards.

If fonts could be specified in either points or mm, with input values allowing 1 digit to the right of the decimal point, that would be
both device- and medium-independent as well as internationally acceptable/useful. I would use 72 points per 25.4 mm (one inch) for converting
between the two.

FWIW, if you want to keep the CSS pixel size of 96 per inch, you should annotate the input box as using that pixel definition. That would be the easy way out, but it would at least bring Moz apps' font-size specs closer to what publishing apps do by making the unit explicit. I certainly wouldn't be against allowing a choice that included the CSS pixel definition (i.e. a choice among the 3), but given that CSS artificially defines a pixel at 96 per inch, I certainly wouldn't make it the only choice.

Sorry, but I'm not an expert on this, as I don't really know how Gecko handles this situation, especially after 8 years.
As far as I'm aware, Thunderbird properly uses the font size of the user's OS/desktop.
The main issue, which I'm not sure is related to this bug, is that the user cannot scale the font of the entire application up and down independently without relying on add-ons.
In general, the base font size should come from the OS, and the rest of our font sizes declared in CSS should be declared in rem so that they scale relative to the root element.
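
A minimal sketch of that rem-based approach (the selectors and the 15px value are purely illustrative; in practice the root size would be derived from the OS setting):

/* Root font size follows the platform/desktop setting (illustrative value) */
:root { font-size: 15px; }

/* Everything else scales relative to the root via rem */
.message-list   { font-size: 1rem; }    /* same as the root size     */
.thread-subject { font-size: 1.2rem; }  /* 20% larger than the root  */
.folder-pane    { font-size: 0.9rem; }  /* 10% smaller than the root */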

Flags: needinfo?(alessandro)
Severity: normal → S3

This bug is not relevant anymore, as the work on global font size and relative units has been implemented and more improvements are coming.

Status: NEW → RESOLVED
Closed: 5 months ago
Resolution: --- → WONTFIX