Type sizes on Mac and PC

  • Thread starter The Devil's Advocate©

Bernhard Sturm

Toby said:
You have still not answered my main point: what would you do in a
situation where the size of an image needed to relate to the size of the
font?

I have said that there are three different solutions:

1. specify a fixed font size;
2. specify a scaled image size; or
3. fix the design so that this is not an issue

And I have repeatedly stated that #3 is the best solution, but that #2
should be chosen in cases where #3 is impossible.

Why? Because I'd rather have a slightly blurry graphic than completely
illegible text.
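
A minimal CSS sketch of option #2, sizing the image in em units so it
tracks the text (the selector name is illustrative):

    /* At a 16px font size this renders 16px tall; if the user
       doubles the text size, the image doubles too (and may
       blur slightly, as noted above). */
    img.text-icon {
        height: 1em;
        width: auto;  /* preserve the aspect ratio */
    }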

I suggest a fourth solution for you:
4. Design your site in Flash; then you have exactly what you want, and
you don't have to tweak XHTML. (I wonder how your page got validated with
an unclosed <img> tag.)

(but I will not be in your boat ;-)

bernhard
 

SpaceGirl

Toby A Inkster said:
Why so? They certainly create an image that is exactly the same height as
the font.

Which, as discussed, is not actually possible. So once again...


(this thread is going to go on forever, isn't it?)
 

SpaceGirl

Bernhard Sturm said:
I think we have two factions here:

- 'Flesh & Bone' designers, who have found that fixed font sizes give
them the ability to control their overall design, the freedom to design
a page where the font size is in harmony with the other style elements.
'Flesh & Bone' designers are aware of the fact that (IE-only) users are
limited in resizing their designs, but they would happily add a second
CSS file covering a bigger (fixed) font size, as sketched below, again
preserving their sacred design harmony.

- 'Content-is-King' designers, who tend to be more on the content side,
have the strong and valid point of view that content rules over design.
They are firm believers that barrier-free web design is more than just
a word, and they build their designs from scratch using relative
font sizes, leaving users the freedom to change the harmony of text and
style elements on their pages. 'Content-is-King' designers look at their
pages as a means of transporting information, not as 'art' or any other
esoteric notion. Users want information, as readable as possible, and as
quickly as possible. On the other hand, 'Content-is-King' designers take
great pains to achieve the 'Flesh & Bone' harmony between typography and
style elements at the 'standard' font size.

Take your pick ;-)
I would place myself in the 'Flesh & Bone' camp, but depending on
the site I wander over to the 'Content-is-King' faction.


Perfect. I rock between the two depending on the client and the project.
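
The 'second CSS' Bernhard mentions is usually wired up as an alternate
stylesheet; a minimal sketch, with illustrative file names:

    <link rel="stylesheet" href="base.css" title="Regular text">
    <link rel="alternate stylesheet" href="large.css" title="Larger text">

Browsers such as Mozilla and Opera list these titles in a page-style
menu, so the reader can switch to the larger fixed sizes.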
 

Toby A Inkster

Bernhard said:
4. Design your site in Flash; then you have exactly what you want, and
you don't have to tweak XHTML.

Arguably even worse than #1.
(I wonder how your page got validated with an unclosed <img> tag.)

Because in HTML, the <img> element's closing tag is forbidden.
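
For illustration, the two syntaxes side by side (attribute values are
arbitrary):

    <!-- HTML 4.01: the IMG end tag is forbidden -->
    <img src="photo.png" alt="A photo">

    <!-- XHTML 1.0: empty elements must be closed -->
    <img src="photo.png" alt="A photo" />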
 

Bernhard Sturm

Toby said:
Arguably even worse than #1.

Because in HTML, the <img> element's closing tag is forbidden.

Yes, you are right. I was wondering because you are using HTML 4.01,
which equals XHTML 1.0, and there you have to close the <img> tag.


bernhard
 

Eric Bohlman

But the physical size of that pixel changes as the screen and/or monitor
size changes. The pixel size at 1280x1024 on a 17-inch monitor is
considerably smaller than on a 21-inch monitor. Thus a 10px font size is
perceived quite differently between the two.


Which leads to the (readability) problem of fixed font-sizes.

Indeed. The fact that pixel size varies according to factors that are
completely beyond the control of a Web designer means that if the designer
attempts to fix font size, he will get the exact opposite of the effect
he's trying to achieve; he will *increase*, rather than reduce, the amount
of variation in the size of text as viewed by his audience. Paradoxically,
the best way to get uniformity in look across the Web is to fix as little
of it as possible.
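
A minimal CSS sketch of that "fix as little as possible" approach, using
the usual relative units:

    /* Leave the base size at the user's preference and express
       everything else relative to it. */
    body { font-size: 100%; }
    h1   { font-size: 1.5em; }  /* 1.5 x whatever the user chose */
    p    { line-height: 1.4; }  /* unitless, so it scales with the font */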
 

kchayka

SpaceGirl said:
*shakes her head in despair*

I thought bands were fickle. Their bloody fans are ten times worse.

Or perhaps you don't know the target audience as well as you thought you
did. ;)
 

kchayka

Bernhard said:
that's perfectly true, and I assume that the user has set the resolution
to this size for a reason. I can also assume that he or she is then
also able to read those smaller fonts.

You assume too much, methinks. Every application I run has some kind
of option for text size, usually a zoom or a default size. This
compensates quite nicely. Font sizes in the UI can be adjusted in
system settings, if need be. There is no reason to squint.
It was always a mystery to me that people want higher
resolutions in order to see MORE content, but then complain that they
can't read the content anymore because the font size is too small,

I haven't experienced that at all. The only problem I have is with web
pages that use absolute font sizes. They are invariably set stoopidly
small.
while setting the resolution to a lower rate gives them less content,
hence you will always lose.

Yes, once I experienced the benefits of a higher resolution, there was
no turning back. I won't change my screen size just because a few web
pages are poorly designed. I would rather those designers saw the light
and corrected their mistakes.
there are so many machines
and browsers out there: my PDA Psion Revo's Opera displays em's and %'s
or pt's (let alone mm's) differently than Konqueror on my Linux
machine, and IE interprets em's slightly differently than other
browsers.

So what if they're rendered a little differently? The whole point (no
pun intended) of em and % units is that they scale to the user's default
font size. If your only concern is that your precious design is
maintained, then forget HTML and CSS and switch to Flash. But why not
show a little respect for your users, eh, and give them a more flexible
design?
But I agree, maybe pixels are not the best way to get rid of the
font-size problem

Like I said, there is no problem except in your own mind.
 

Bernhard Sturm

Owen said:
^^^^^^^^^^^^^^^^^^^^^^^^

No.

Yes, you are right; I was wrong. I was just reading the specs of XHTML 1.0
vs. HTML 4.01 again, and indeed HTML 4.01 allows open tags ;-). It just
jumps out at me every time I see non-well-formed HTML *g*
 

Bernhard Sturm

Steve said:
Are you sure? Photoshop tells me that Toby's image is 72 dpi.

Strange; FW MX2004 tells me that Toby's image is at 28dpi (256px x
256px @ 28dpi = 23cm x 23cm).

Why the difference? I only use Photoshop for print image
manipulation, never for the web, as there are better tools for image
optimisation than PS (IR, FW)...
Is anybody familiar with this effect?
Anyway, the resolution of any image is totally meaningless on the web,
unless a user downloads the image to print out from a graphics program
the resolution never comes into play at all.

Why is the resolution meaningless? I always thought that it really makes
a difference whether I put a 300dpi image (50px x 50px) or a 72dpi image
(50px x 50px) on a website (size matters ;-), as the user has
to download different file sizes...
Even more confused: isn't it like this anymore?

bernhard
 

Barry Pearson

Bernhard said:
Steve Pugh wrote: [snip]
Anyway, the resolution of any image is totally meaningless on the
web, unless a user downloads the image to print out from a graphics
program the resolution never comes into play at all.

why is the resolution meaningless? I always thought that it really
makes a difference whether I put a 300dpi image (50px x 50px) or a
72dpi image (50px x 50px) on a website (size matters ;-), as the
user has to download different file sizes...
even more confused: isn't it like this anymore?

A digital image comprises an array of pixels. A pixel is the smallest unit of
the image, and in effect represents a square of the final display (at least
for still images). A pixel requires storage space to hold the binary numbers
for the colours. (Some file formats need about 3 bytes per pixel. But web
images nearly always need significantly less than this). Then, by default,
each pixel gets mapped onto a unit of the screen. (That is somewhat
simplified! I have a URL below that goes into a bit more detail).

A 50px by 50px image has a size in bytes dictated by the need to represent the
colours of those 2,500 pixels. Any other number, such as a ppi (pixels per
inch) value (or a strange variant such as dpi, dots per inch), makes no
difference to the number of bytes. Nor does it make a difference to the way
the image is mapped onto the screen on the web. Think of those numbers (if
indeed they exist in the digital image file, and are not just something that
the package that created it talks about) as little more than comments or
preferences.
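
As a worked example of why those numbers are mere comments:

    50 px x 50 px               = 2,500 pixels
    2,500 pixels x 3 bytes each = 7,500 bytes (before compression)

Tagging that same file as 72 ppi or 300 ppi changes neither the pixel
count nor the bytes, and on the web it is still mapped onto 50 x 50
screen pixels.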

The best analogy I have been able to think of is to consider that pixels are
valuable things, like the words in your book. They cost resources to create in
the first place, and need storage space. They can sometimes be sold. A
photo-editor can sensibly be thought of as a "pixel processor", corresponding
to "word processor".

But ppi and dpi, etc., are a bit like the "font size" of your book. They are a
current preference which can be overridden when you want, and some formats
don't even hold them. The ppi or font size finally used when printing makes a
difference to the size of words and pixels on the final medium, and hence to
the total size of the final printed version. But the value used may not be the
one stored in the file. (I tell Photoshop what size in centimetres to print my
photographs.)

Some people can manage to work with ppi and dpi for particular purposes,
because they use packages designed for that purpose that help them do so. But
once you start to move digital images from one place to another, and/or use
them for the web, if you don't think of them simply as arrays of pixels you
may get very confused.

I have started to write this up. Here is my first page on the subject. It was
written last month for the sake of someone else who thought file sizes took
into account ppi & dpi values instead of just being based on the pixel-count.
http://www.barry.pearson.name/articles/digital_images/examples.htm
 

SpaceGirl

kchayka said:
Or perhaps you don't know the target audience as well as you thought you
did. ;)

Perhaps. Take a look at www.subhuman.net - this was a fun site my partner
and I did. It's sorta experimental. Anyway, we have around 12,000 unique IPs
tracked through that site A DAY.

We must be getting something right.
 

Bernhard Sturm

Barry Pearson said:
Some people can manage to work with ppi and dpi for particular purposes,
because they use packages designed for that purpose that help them do so. But
once you start to move digital images from one place to another, and/or use
them for the web, if you don't think of them simply as arrays of pixels you
may get very confused.

I have started to write this up. Here is my first page on the subject. It was
written last month for the sake of someone else who thought file sizes took
into account ppi & dpi values instead of just being based on the pixel-count.
http://www.barry.pearson.name/articles/digital_images/examples.htm

Uff... and I always thought that the resolution was device dependent, but
as you have clearly shown me, this is not the case. So I can gladly put my
50x50 pixel image at 300dpi on my web site.
</irony>

Your explanation is right concerning the way images are built, but
you missed one very important point: the physical resolution of your
display device. Your theory is all correct as long as you are
not using any display device.
Because if you say the word 'pixel', you also have to say how big
your 'picture element' (that's a pixel) is in the real world. Is it
0.01mm, 1mm, or 0.002mm? Your explanation is like geometry, where you
say a point has no dimension because it's just a 'point' in space, not
connected to any physical properties. But if you draw a point on a piece
of paper you are no longer in your theory, because you now have a
visible, hence measurable, point...
And that's where the pixels/inch (or, more correctly, samples/inch)
resolution comes from. Every display device (be it paper, be it a monitor)
has certain physical features. These features limit the number of
pixels per visible area. For a monitor this is usually 72 pixels per inch
(ppi, or dpi for your inkjet printer, though be careful: that is NOT the
same as your monitor's dpi; for printing purposes (laser printers) the
correct term is lpi, lines per inch).
On an average monitor you can display 72 dots per inch. This means that
one inch contains 72 dots. But there are displays (high-end graphics
displays, avionics displays) that have a significantly higher number of
pixels per inch (and these displays are also much more expensive, as the
manufacturing process is much more complicated). Now let's assume we
have a display with 144 dpi. If you displayed your original image, sized
for those 72 pixels per inch, your visible image would then be only half
an inch. So it seems to make a difference, if not in dimension, then in
size. ;-)
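
As a worked example, for an image 72 pixels across:

    72 px / 72 ppi  = 1 inch on screen
    72 px / 144 ppi = 0.5 inch on screen

Same pixels, half the physical width on the denser display.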

And don't tell me that you can always put a 300dpi, 50px x 50px image
on a screen, but on a printer it would be a perfect 5cm x 5cm
image ;-)

One source that explains this a bit better:
http://graphicdesign.about.com/library/weekly/aa070998.htm

HTH
bernhard

www.daszeichen.ch
remove nixspam to reply
 

Bernhard Sturm

Bernhard Sturm said:
And don't tell me that you can always put a 300dpi, 50px x 50px image
on a screen, but on a printer it would be a perfect 5cm x 5cm
image ;-)

One source that explains this a bit better:
http://graphicdesign.about.com/library/weekly/aa070998.htm

I stand corrected here (and hope this thread can be closed with this,
too ;-).
Barry was quite correct in his post. Yes, I confess, I learned
something. But as I always swap between print and screen, dpi is a
concern for me; still, it's obvious that it doesn't matter as long as you
stick to display devices that have pixels as units (as long as you are not
going to measure their real physical size on a screen... yes, yes...)

cheers
bernhard
 

Barry Pearson

Bernhard said:
Barry Pearson wrote: [snip]
Some people can manage to work with ppi and dpi for particular
purposes, because they use packages designed for that purpose that
help them do so. But once you start to move digital images from one
place to another, and/or use them for the web, if you don't think of
them simply as arrays of pixels you may get very confused.

I have started to write this up. Here is my first page on the
subject. It was written last month for the sake of someone else who
thought file sizes took into account ppi & dpi values instead of
just being based on the pixel-count.
http://www.barry.pearson.name/articles/digital_images/examples.htm

Uff... and I always thought that the resolution was device dependent,
but as you have clearly shown me, this is not the case. So I can gladly
put my 50x50 pixel image at 300dpi on my web site.
</irony>

You can put it on your web site. But what does that "300dpi" mean?
Your explanation is right concerning the way images are built, but
you missed one very important point: the physical resolution of your
display device. Your theory is all correct as long as you are not
using any display device.

As I type this, and when I browse the web, I see it on 2 display devices
simultaneously. (I run with a calibrated CRT monitor displaying at about 90
pixels per inch attached to my laptop which displays at about 117 pixels per
inch on its TFT screen). I could, of course, instead be projecting it onto a
screen at 1 pixel per inch! And no part of the system would have a clue about
these ppi values, or care about them.

That should illustrate the futility, indeed the lack of meaning, of talking
about ppi (or, even worse, dpi) in connection with display devices. And forget
any assumption that screens work at 72 ppi. That was once true on Macs, and
you can find the odd laptop that has that value on its built-in screen (which
would be different from an attached screen). But modern screens tend to have
much higher ppi values, and some appear to be climbing towards 200 ppi. And
browsers don't care!

Please note that I am not simply talking theory. I have lots of experience
over the years of scanning, of using images from digital cameras, building
images in various packages, displaying them on screens, printing them as
various sizes, moving them one package to another, combining them, etc. I am
speaking about what works in practice. If you want to check me out a bit more,
try the following thumbnail gallery and click on the "info" links. You will
see that I do understand these issues.
http://www.barry.pearson.name/photography/portfolios/lrps.htm
Because if you say the word 'pixel', you also have to say how big
your 'picture element' (that's a pixel) is in the real world. Is it
0.01mm, 1mm, or 0.002mm?

No, I don't have to say how big it is! It is a pixel in a digital image. It
represents (in effect) a square of the scene. That pixel, during its life, may
be viewed at many different resolutions, sometimes simultaneously. And when it
isn't being viewed, it is some bits on a disc somewhere. It no more has a size
in the real world than a word in a "DOC" file has a size in the real world.
(There may simply be a preference in the form of a font size).

At the instant that it is being printed, there is indeed a ppi value. Because
there are pixels in the image, and these are being spread over inches. The
value exists for that print, but may be different for the next one. The value
may be in the file format for the digital image, or may be temporary. (I
believe some file formats don't hold a ppi value in them? Does PNG? I couldn't
find it).
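
A worked example of that print-time relationship (the numbers are
illustrative):

    printed size = pixels / ppi
    1500 px x 1050 px at 300 ppi -> 5 in x 3.5 in
    1500 px x 1050 px at 150 ppi -> 10 in x 7 in

The pixels are unchanged; only the ppi value chosen for that particular
print differs.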
Your explanation is like geometry, where you say a point has no
dimension because it's just a 'point' in space, not connected to any
physical properties. But if you draw a point on a piece of paper you
are no longer in your theory, because you now have a visible, hence
measurable, point...

Indeed, for each transfer to physical media, there is a ppi value (which may
be different each time).
and that's where the pixels/inch (or, more correctly, samples/inch)
resolution comes from.

"Sample per inch" is not correct! It is used within the scanning community,
although rarely, I believe. (I have not come across scanning software that
uses spi, but there may be some. Most appear to use dpi, which is pretty
silly, but I guess it is historical sloppy terminology. My scanner uses ppi,
which is exactly what I want it to do).
every display device (be it paper,
be it a monitor) has certain physical features. These features
limit the number of pixels per visible area. For a monitor this is
usually 72 pixels per inch (ppi, or dpi for your inkjet printer, though
be careful: that is NOT the same as your monitor's dpi; for printing
purposes (laser printers) the correct term is lpi, lines per inch).

For monitors: see above about 72ppi. And are you talking about phosphor dots
when you mention dpi? Is that confusing alternative terminology for ppi?
On an average monitor you can display 72 dots per inch. This means
that one inch contains 72 dots. But there are displays (high-end
graphics displays, avionics displays) that have a significantly higher
number of pixels per inch (and these displays are also much more
expensive, as the manufacturing process is much more complicated).
Now let's assume we have a display with 144 dpi. If you displayed
your original image, sized for those 72 pixels per inch, your visible
image would then be only half an inch. So it seems to make a difference,
if not in dimension, then in size. ;-)

Indeed. And W3C has a proposal to deal with this, although I am not aware of
any browser that implements it. And I suspect that unless they scaled up
images and text in the same way, things would break.
http://www.w3.org/TR/REC-CSS2/syndata.html#length-units
And don't tell me that you can always put a 300dpi, 50px x 50px image
on a screen, but on a printer it would be a perfect 5cm x 5cm
image ;-)

One source that explains this a bit better:
http://graphicdesign.about.com/library/weekly/aa070998.htm

I am aware of that source, and it is one of the reasons I want to write my own
pages to clear up some of the issues it propagates. For example:

Although spi is OK for scanning, I believe ppi is better for most
purposes. After all, what you are trying to do is get an array of pixels in
memory. I normally scan at 4000ppi, so from a mounted 35mm slide I get 5590 x
3700 pixels. Yes, it samples at 4000spi. But in fact even if it sampled at
8000spi, I would still want it to deliver 4000ppi - because that is how many
pixels I want. So spi appears to be more about how it is implemented rather
than giving me what I actually want.
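
Checking those numbers (the frame dimensions are approximate):

    mounted 35mm frame: about 1.40 in x 0.93 in
    1.40 in x 4000 ppi ~ 5,600 px
    0.93 in x 4000 ppi ~ 3,700 px

which matches the 5590 x 3700 pixels quoted above.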

Although it is often OK in principle to talk about dpi in connection with
printing, in practice there is a lot of confusion. Sometimes people are
talking about having enough pixels for the printing process. So they may be
concerned about getting (say) 300ppi. Sometimes, however, people start talking
about dots of ink or droplets on the paper. Inkjet printers (or at least
photographic quality ones, such as the Epson 2000P that I use) have a highly
variable algorithm for putting droplets of various sizes onto the paper. And
sometimes getting too involved with ppi or dpi causes people to resample their
digital image (up or down) to get 300ppi. But that may be the wrong thing to
do. It may be better simply to let the printer driver do the job. And it will
then resample to its native resolution, which may not be anything that the
user is aware of. (I believe my HP 950C is 600ppi, while my Epson is 720ppi).

The problem is not that what people say is wrong in a local context. After
all, if your scanning software insists on talking about dpi, you have to
accept that when dealing with it.

But when you look at how every component of a complex digital imaging system
fits together, you need an architectural concept that works across the whole
system, then map it onto local terms when forced to. The fundamental concept
is that of a digital image comprising an array of pixels, each representing a
square of the image. Anyone who understands that can work out how to deal with
local cases if forced to. It is the only common concept.
 

Toby A Inkster

Bernhard said:
On an average monitor you can display 72 dots per inch.

That's a bit of a simplistic, and not necessarily realistic, model.

I have a 17" monitor. We all know that these terms are just sales jargon
and the actual viewable area of a 17" monitor is just under 16". In my
case, this works out as about 12.5" wide by 9.4" high.

If I run my display at 1600x1200 resolution, this works out as 1600 dots
divided by 12.5 inches: 128 dots per inch.

If I run my display at 800x600 resolution, it's 64dpi. At 640x480, it's
51dpi.

In actual fact, I tend to run my display at 1280x1024, or 102dpi.

Being technically minded, I have configured my OS so that it *knows* I'm
at 102dpi, so if it needs to display a 3 inch line, it knows that it
should display a 306 pixel line.

However, most people aren't technical enough to configure that, so will
leave their dpi set to their OS default, which is 72dpi on Mac, 96dpi on
Windows and 75dpi or 100dpi on Linux/Unix.
 

Firas D.

SpaceGirl said:
Perhaps. Take a look at www.subhuman.net - this was a fun site my partner
and I did. It's sorta experimental. Anyway, we have around 12,000 unique IPs
tracked through that site A DAY.

We must be getting something right.
It's pretty good... except the pink... and the lipstick... lol.
 

Mark Parnell

In theory, it's the individual.
In practice, it's the designer.

Hardly. No matter how hard you try, I can override your font size
settings. The ultimate power always rests with the end user.
 
