Image Basics - digital images

  Overview of terminology and how halftones are digitally produced

ppi: pixels per inch. Describes the digital pixel resolution of an image. The term dpi is often used instead, but strictly speaking this is incorrect: input devices such as scanners produce pixels; only output printers produce dots. Scanners are nevertheless often advertised with their scanning resolution expressed in dots per inch, a usage that has now fallen into common practice and only adds to the confusion. Monitor resolution is also specified in ppi: Macintosh monitors have a resolution of 72 ppi, whereas PC monitors are usually 96 ppi. One reason for the Mac default of 72 ppi is that, with 72 points to an inch, one pixel corresponds to one point, which helps graphic designers get a better feel for the weight of their forms when laying out a page.
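The relationship between pixel dimensions, ppi and physical print size can be sketched with a couple of small helper functions (hypothetical names, just for illustration):

```python
def print_size_inches(width_px, height_px, ppi):
    """Physical print size (in inches) for an image output at a given ppi."""
    return width_px / ppi, height_px / ppi

def pixels_needed(width_in, height_in, ppi):
    """Pixel dimensions required to print at a given physical size and ppi."""
    return round(width_in * ppi), round(height_in * ppi)

# The same 1440 x 960 pixel image prints much smaller at a print
# resolution of 300 ppi than at a screen resolution of 72 ppi:
print(print_size_inches(1440, 960, 300))  # (4.8, 3.2) inches
print(print_size_inches(1440, 960, 72))   # (20.0, 13.33...) inches
```

The same pixels spread over fewer inches simply means a higher ppi, which is why "resolution" on its own is meaningless without a physical size attached.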

dpi: dots per inch. Refers to the resolution of a printing device. An output device such as an imagesetter (e.g. the Excalibur X-45) produces tiny 100% black dots at a specified resolution. Suppose the X-45 is run at its resolution of 2400 dpi and the printer wishes to use a screen ruling of 150 lines per inch. Dividing the dpi of 2400 by the lpi of 150 gives 16, so each halftone cell is a matrix of 16 x 16 printer dots, within which the imagesetter can generate a halftone dot varying in size from 0 to 256 printer dots (see Fig 1). These dot sizes are the levels of grey in a grayscale image, i.e. 256 levels of grey. At 1200 dpi with a screen ruling of 133 lpi you get only 81 levels: 1200 divided by 133 is 9, giving a dot matrix of 9 x 9 = 81.
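The arithmetic above generalizes to any dpi/lpi combination; a minimal sketch:

```python
def gray_levels(dpi, lpi):
    """Halftone cell size and available grey levels for a dpi/lpi pair.

    Each halftone cell is an n x n matrix of printer dots, where
    n = dpi // lpi; the cell can then render n * n dot sizes (shades).
    """
    n = dpi // lpi          # printer dots per side of the halftone cell
    return n, n * n

print(gray_levels(2400, 150))  # (16, 256)
print(gray_levels(1200, 133))  # (9, 81)
```

This is why raising the screen ruling on a fixed-resolution device costs you grey levels: the cell gets smaller, so fewer dot sizes fit inside it.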

lpi: lines per inch. The number of halftone lines or ‘cells’ in an inch, also described as the screen ruling. The origins of the term go back well before the days of digital desktop publishing. Logically the correct term for image resolution would be pixels per inch, but there is no clear, definitive answer to the mixing of dpi, lpi and ppi: it is an example of how two separate disciplines, traditional repro and those who developed the digital technology, have chosen to apply different meanings to these terms.


[Fig 1: a halftone cell, showing a halftone dot made up of printer dots]

Each halftone dot is rendered by a PostScript RIP from the pixel data and sent to an output device such as an imagesetter, laser printer or platemaker. The halftone dot shown here is plotted using a 16 x 16 dot matrix, which can therefore reproduce a total of 256 shades of grey (16 x 16 = 256; this relates to the 0-255 Levels dialog in Photoshop). How many of the dots in the halftone cell are filled with black or colour determines the shade: a possible 256 different greys in the case of black.

The dpi resolution of the imagesetter, divided by 16, gives the best line screen resolution: 2400 dpi divided by 16 = 150 lpi. (Other rulings such as 133 or 100 can be used, but 150 will display the most levels of grey at this device resolution.)
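A toy sketch of the RIP's job for one cell: for an 8-bit grey value, switch on a proportional number of printer dots in the 16 x 16 matrix. (A real RIP uses a spot function to order the dots so they cluster into a round halftone dot; here we only count them.)

```python
def dots_on(gray, n=16):
    """Number of printer dots switched on in an n x n halftone cell
    for an 8-bit grey value (0 = black, 255 = white)."""
    coverage = (255 - gray) / 255    # fraction of the cell that is inked
    return round(coverage * n * n)

print(dots_on(0))    # solid black: all 256 dots on
print(dots_on(255))  # paper white: 0 dots on
print(dots_on(128))  # mid grey: roughly half the cell inked
```

The 256 possible counts in a 16 x 16 cell are exactly the 256 shades the text describes.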

Tip: the PostScript point (72 to the inch) is slightly larger than the traditional printer’s Hawks point (Hawks point = .013838”, PostScript point = .013888”). PostScript also works in device pixels, not inches, and this can cause rounding up/down problems in conversion over large areas.
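The tiny difference between the two point sizes quoted above compounds over distance; a quick sketch using the figures from the tip:

```python
# Values taken from the text: one Hawks point is .013838 in,
# one PostScript point is exactly 1/72 in (.013888... in).
HAWKS_PT = 0.013838
PS_PT = 1.0 / 72.0

def drift_inches(n_points):
    """Cumulative difference, in inches, when a length of n_points
    is measured in PostScript points instead of Hawks points."""
    return n_points * (PS_PT - HAWKS_PT)

# Invisible over one point, but over a long measure it adds up:
print(drift_inches(1))       # about 5 hundred-thousandths of an inch
print(drift_inches(5000))    # roughly a quarter of an inch
```

This is the kind of conversion drift the tip warns about when working over large areas.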

Of Interest

The History of the Printer’s Point

Simon Fournier proposed a standardized system of 72 points per inch in 1737, and published a printed scale for reference. The published tract created confusion among printers because, as the printed pages dried, the printed scale shrank in size! The published scale was unfortunately used as a reference, and therefore printers and font makers suffered from the creation and use of inconsistent tools and measures from that time on.

In 1770 Francois Didot proposed a solution to this problem by defining a point as exactly 1/72 of a French inch. (The French inch and the English inch are not the same; a French inch is equal to 1.0638 English inches.) This measurement quickly gained popularity in France and across Europe, and until the 20th century the Didot point remained the standard continental measurement.

Things weren’t as clear-cut in Britain and in the United States. Standardization in those countries didn’t come about until after the Chicago fire of 1871, which destroyed the foundry of Marder, Luse & Co. The foundry was forced to rebuild all of its molds from scratch, and commissioned Nelson Hawks to size their types. Hawks fixed the point at .013838 inch (basing his measurement on a popular pica type of the day); this became the standard typographer’s point until the development of computerized typography. If one Hawks point equals .013838 inch, then 72 Hawks points equals .996336 inch, which is not an elegant sum for computerized graphics.

When PostScript was created by Warnock and Geschke in the 1980s, a point was defined as exactly 1/72 of an inch (.013888”), a much better value for computations. The PostScript point has since become the standard point size for all computer graphics applications.
