Bernhard said:
Barry Pearson wrote: [snip]
Some people can manage to work with ppi and dpi for particular
purposes, because they use packages designed for that purpose that
help them do so. But once you start to move digital images from one
place to another, and/or use them for the web, if you don't think of
them simply as arrays of pixels you may get very confused.
I have started to write this up. Here is my first page on the
subject. It was written last month for the sake of someone else who
thought file sizes took into account ppi & dpi values instead of
just being based on the pixel-count.
http://www.barry.pearson.name/articles/digital_images/examples.htm
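The point about file size following pixel count can be sketched in a few lines. This is a minimal illustration for uncompressed data (compressed formats vary in size, but they too ignore ppi); the function name is mine:

```python
def uncompressed_size_bytes(width_px, height_px, bytes_per_pixel=3):
    # Raw pixel data: width x height x bytes per pixel.
    # No ppi or dpi term appears anywhere in this calculation.
    return width_px * height_px * bytes_per_pixel

# A 50 x 50 RGB image carries 7500 bytes of pixel data,
# whatever ppi value its metadata might claim.
print(uncompressed_size_bytes(50, 50))
```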
Uff... and I always thought that resolution was device dependent, but
as you clearly showed me, this is not the case. So I can gladly put my
50x50 pixel image at 300dpi on my web site.
</irony>
You can put it on your web site. But what does that "300dpi" mean?
Your explanation is right concerning the way images are built, but
you missed one very important point: the physical resolution of your
display device. Your theory is all correct as long as you are not
using any display device.
As I type this, and when I browse the web, I see it on 2 display devices
simultaneously. (I run with a calibrated CRT monitor displaying at about 90
pixels per inch attached to my laptop which displays at about 117 pixels per
inch on its TFT screen). I could, of course, instead be projecting it onto a
screen at 1 pixel per inch! And no part of the system would have a clue about
these ppi values, or care about them.
That should illustrate the futility, indeed the lack of meaning, when talking
about ppi (or even worse dpi) in connection with display devices. And forget
any assumption that screens work at 72 ppi. That was once true on Macs, and
you can find the odd laptop that has that value on its built-in screen (which
would be different from an attached screen). But modern screens tend to have
much higher ppi values, and some appear to be climbing towards 200 ppi. And
browsers don't care!
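The arithmetic behind that illustration is just pixels divided by device density. A minimal sketch (the 900-pixel width is an assumed figure; the ppi values are the three devices mentioned above):

```python
def displayed_width_inches(image_px, device_ppi):
    # The device's pixel density, not anything in the image file,
    # decides the physical size on screen.
    return image_px / device_ppi

# The same 900-pixel-wide image on three different devices:
for device, ppi in [("CRT", 90), ("laptop TFT", 117), ("projector", 1)]:
    print(f"{device}: {displayed_width_inches(900, ppi):.1f} inches wide")
```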
Please note that I am not simply talking theory. I have lots of experience
over the years of scanning, of using images from digital cameras, building
images in various packages, displaying them on screens, printing them at
various sizes, moving them from one package to another, combining them, etc. I am
speaking about what works in practice. If you want to check me out a bit more,
try the following thumbnail gallery and click on the "info" links. You will
see that I do understand these issues.
http://www.barry.pearson.name/photography/portfolios/lrps.htm
Because if you say the word 'pixel' then you also have to say how big
in the real world your 'picture element' (that is what a pixel is) is.
Is it 0.01mm or 1mm or 0.002mm?
No I don't have to say how big it is! It is a pixel in a digital image. It
represents (in effect) a square of the scene. That pixel, during its life, may
be viewed at many different resolutions, sometimes simultaneously. And when it
isn't being viewed, it is some bits on a disc somewhere. It no more has a size
in the real world than a word in a "DOC" file has a size in the real world.
(There may simply be a preference in the form of a font size).
At the instant that it is being printed, there is indeed a ppi value. Because
there are pixels in the image, and these are being spread over inches. The
value exists for that print, but may be different for the next one. The value
may be in the file format for the digital image, or may be temporary. (I
believe some file formats don't hold a ppi value in them at all; PNG can,
but only in an optional chunk that is easy to miss.)
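For what it's worth, PNG does have a place for this: the optional pHYs chunk records pixel density as pixels per unit (unit flag 1 = metre), and an image without that chunk simply carries no density at all. A minimal sketch of reading it, walking the chunk list by hand (function name is mine; CRCs are not verified):

```python
import struct

PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def png_phys(data):
    # Each PNG chunk is: 4-byte length, 4-byte type, data, 4-byte CRC.
    if data[:8] != PNG_SIGNATURE:
        raise ValueError("not a PNG file")
    pos = 8
    while pos + 8 <= len(data):
        length, ctype = struct.unpack(">I4s", data[pos:pos + 8])
        if ctype == b"pHYs":
            # 4-byte x density, 4-byte y density, 1-byte unit (1 = metre).
            return struct.unpack(">IIB", data[pos + 8:pos + 17])
        pos += 12 + length
    return None  # no pHYs chunk: the image carries no density at all
```

11811 pixels per metre is roughly 300 ppi (300 / 0.0254), which is how a "300 ppi" image ends up stored in PNG.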
Your explanation is like geometry, where you say a point has no
dimension, because it is just a 'point' in space, not connected to any
physical properties. But if you draw a point on a piece of paper you
are no longer within your theory, because you now have a visible,
hence measurable, point...
Indeed, for each transfer to physical media, there is a ppi value (which may
be different each time).
and that's where you get the pixels/inch (or more correct
samples/inch) resolution from.
"Sample per inch" is not correct! It is used within the scanning community,
although rarely, I believe. (I have not come across scanning software that
uses spi, but there may be some. Most appear to use dpi, which is pretty
silly, but I guess it is historical sloppy terminology. My scanner uses ppi,
which is exactly what I want it to do).
every display device (be it paper,
be it a monitor) has certain physical features. These features
limit the number of pixels per visible area. For a monitor this is
usually 72 pixels per inch (ppi, or dpi; dpi also applies to your
inkjet printer, but be careful: that is NOT the same as your monitor's
dpi. For laser printing the correct term is lpi, lines per inch).
For monitors: see above about 72ppi. And are you talking about phosphor dots
when you mention dpi? Is that confusing alternative terminology for ppi?
On an average monitor you can display 72 dots per inch. This means
that one inch contains 72 dots. But there are displays (high-end
graphics displays, avionics displays) that have a significantly higher
density of pixels per inch (and these displays are also much more
expensive, as the manufacturing process is much more complicated).
Now let's assume we have a display with 144 dpi. If you displayed
your original image, made for those 72 pixels per inch, your visible
image would then be only half an inch. So it seems to make a difference,
if not in dimension, then in size. ;-)
Indeed. And W3C has a proposal to deal with this, although I am not aware of
any browser that implements it. And I suspect that unless they scaled up
images and text in the same way, things would break.
http://www.w3.org/TR/REC-CSS2/syndata.html#length-units
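A rough sketch of what honouring such absolute length units would demand of a browser: work out how many device pixels an absolute length needs, which is exactly the scaling that would have to apply to images and text alike. (The function name is mine; the 72/144 figures come from the example above.)

```python
def pixels_needed(physical_inches, device_ppi):
    # Device pixels required to honour an absolute length
    # such as CSS 'width: 1in' on a given screen.
    return round(physical_inches * device_ppi)

print(pixels_needed(1.0, 72))   # 72 device pixels on a 72 ppi screen
print(pixels_needed(1.0, 144))  # 144: a 72-pixel image must be scaled up 2x
```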
And don't tell me that you can always put a 300dpi image with 50px x
50px on a screen, but on a printer this would be a perfect 5cm x 5cm
image ;-)
One source, that explains this a bit better:
http://graphicdesign.about.com/library/weekly/aa070998.htm
I am aware of that source, and it is one of the reasons I want to write my own
pages to clear up some of the issues it propagates. For example:
Although spi is OK for scanning, I believe for most purposes ppi is
better. After all, what you are trying to do is get an array of pixels in
memory. I normally scan at 4000ppi, so from a mounted 35mm slide I get 5590 x
3700 pixels. Yes, it samples at 4000spi. But in fact even if it sampled at
8000spi, I would still want it to deliver 4000ppi - because that is how many
pixels I want. So spi appears to be more about how it is implemented rather
than giving me what I actually want.
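The arithmetic behind those pixel counts is just physical size times scanning resolution. A minimal sketch, assuming a visible aperture of roughly 35.5mm x 23.5mm for a mounted 35mm slide (an assumed figure, chosen because it matches the quoted result):

```python
MM_PER_INCH = 25.4

def scan_pixels(width_mm, height_mm, scan_ppi):
    # Pixels delivered = physical size in inches x scanning resolution.
    return (round(width_mm / MM_PER_INCH * scan_ppi),
            round(height_mm / MM_PER_INCH * scan_ppi))

# Assumed aperture of a mounted 35mm slide, scanned at 4000 ppi:
print(scan_pixels(35.5, 23.5, 4000))  # close to the 5590 x 3700 quoted
```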
Although it is often OK in principle to talk about dpi in connection with
printing, in practice there is a lot of confusion. Sometimes people are
talking about having enough pixels for the printing process. So they may be
concerned about getting (say) 300ppi. Sometimes, however, people start talking
about dots of ink or droplets on the paper. Inkjet printers (or at least
photographic quality ones, such as the Epson 2000P that I use) have a highly
variable algorithm for putting droplets of various sizes onto the paper. And
sometimes getting too involved with ppi or dpi causes people to resample their
digital image (up or down) to get 300ppi. But that may be the wrong thing to
do. It may be better simply to let the printer driver do the job. And it will
then resample to its native resolution, which may not be anything that the
user is aware of. (I believe my HP 950C is 600ppi, while my Epson is 720ppi).
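Before resampling to chase a target like 300ppi, it is worth checking the ppi the print would actually receive, which is simply pixels spread over inches. A small sketch with illustrative numbers (not from the text):

```python
def effective_ppi(image_px, print_inches):
    # The ppi a given print actually receives: pixels spread over inches.
    return image_px / print_inches

# 3000 pixels across a 10-inch-wide print already lands at 300 ppi;
# resampling the file first would gain nothing over letting the
# printer driver do its own conversion to its native resolution.
print(effective_ppi(3000, 10))
```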
The problem is not that what people say is wrong in a local context. After
all, if your scanning software insists on talking about dpi, you have to
accept that when dealing with it.
But when you look at how every component of a complex digital imaging system
fits together, you need an architectural concept that works across the whole
system, and then map it onto local terms when forced to. The fundamental concept
is that of a digital image comprising an array of pixels, each representing a
square of the image. Anyone who understands that can work out how to deal with
local cases if forced to. It is the only common concept.