Lost and Font
Some say that Johannes Gutenberg, a German inventor, craftsman, and goldsmith, invented the printing press in the mid-15th century (around 1440) in Mainz, Germany, printing the Gutenberg Bible in 1455. However, the Chinese actually invented movable type printing first (Bi Sheng, from Bianliang, the old capital of the Northern Song Dynasty, now Kaifeng); what Gutenberg added was a pressing mechanism built from wood. (There is evidence supporting Chinese priority in printing: the oldest dated printed book, the Diamond Sutra, a Buddhist text, was found in Dunhuang, China.) Gutenberg made durable metal letter blocks that, following the techniques of movable type printing, could take an entire day to arrange on the base, after which workers could print pages in mass production. The blocks were based on Blackletter calligraphy (the style of The New York Times nameplate, redrawn by Ed Benguiat in 1967), which was commonly used to write manuscripts. The catch was that these letters crowded the space on a page, leading to longer books and more time spent printing.
Nicolas Jenson recognized in 1470 that simpler letterforms could reduce printing time, and he created the first Roman typeface. It is the basis for many modern fonts, such as Centaur (Bruce Rogers, 1914) and Adobe Jenson (Robert Slimbach, 1996).
​
The first italic typeface was invented in 1501 by Aldus Manutius and Francesco Griffo, allowing more text to fit on a page. Despite that original purpose, italics are now used mainly to emphasize text.
​
In 1734, William Caslon created a new typeface, now referred to as "Old Style", with letterforms designed to be easier to distinguish.
In the 1780s, Firmin Didot (French) and Giambattista Bodoni (Italian) created modern serif typefaces with strong contrast between thick and thin strokes. The first slab serif typeface was designed by Vincent Figgins in 1815. The first sans serif typeface, "Two Lines English Egyptian" (also known as "Caslon Egyptian"), appeared in 1816, designed by William Caslon IV.
​
Frederic Goudy, the first full-time typeface designer, began working in the 1920s and created fonts such as Copperplate Gothic and Goudy Old Style, the latter based on the original Old Style.
​
![baskerville typeface pic.jpg](https://static.wixstatic.com/media/c2c472_bcba8be5caf04033b7f83ffbe239ebdb~mv2.jpg/v1/fill/w_446,h_264,al_c,q_80,usm_0.66_1.00_0.01,enc_avif,quality_auto/baskerville%20typeface%20pic.jpg)
The most iconic typeface of the 20th century is Helvetica, designed by Max Miedinger in 1957. The first digital typeface (Digi Grotesk) was designed by Rudolf Hell in 1968. In 1974, the first vector typefaces were developed.
​
In 2009, the Web Open Font Format (WOFF) was developed and added to the W3C open web standard. In 2016, variable fonts were introduced within the OpenType standard.
​
Typography has evolved a lot, from a handful of original typefaces to the staggering number of typefaces there are today (1,079,928 according to whatfontis.com), and it will continue to evolve, adapting to our needs.
From WSC: One clue to your whenabouts might be the text around you: not just the headlines on newspapers and store signs, but the fonts they’re printed in. Consider some of the history of typography, then discuss with your team: how different would the world look today if Microsoft had chosen Comic Sans instead of Calibri as its default typeface in the early 2000s—or as its successor 20 years later.
![aptos](https://static.wixstatic.com/media/c2c472_22fd0411634b43a39112bd983d218b47~mv2.png/v1/fill/w_203,h_290,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/aptos.png)
In 2023, Microsoft replaced Calibri, the default font it had used for 15 years, with Aptos. Aptos was originally named Bierstadt, after a mountain in Colorado, the state where the font's designer, Steve Matteson, lives. Bierstadt is German for "beer city", and Microsoft decided to rename it to Aptos because "people didn't take it seriously." Users can also choose to set other fonts as the default, including older standards. Matteson was asked to design a font in the grotesque sans-serif style, with the company not letting on that it was to become the new default. Even though Matteson was still working for the font company Monotype at the time, he and his colleagues provided several options to Microsoft without attaching the contributors' names, as they didn't want their connections to influence Microsoft's decision.
​
Matteson had helped with the TrueType fonts for Windows 3.1 in the 1990s and created Segoe, the font in Microsoft's current logo, so his history with the software giant goes way back.
Aptos is the name of a coastal town in Santa Cruz County, California. Matteson thought of it because of the diversity of the California landscape, from beaches to redwood forests, and he wanted to convey that there are "different voices [one] could speak in without distorting the message."
​
Matteson created different versions of the font, from a serif version to a monospaced one (used for coding). He also worked with Microsoft to ensure the best quality in different scenarios; for example, when Aptos is used in an Excel document, the numbers are less likely to overflow into the next cell. People have noted that it is easier to tell a capital "I" from a lowercase "l" in Aptos. Still, Matteson respects Calibri's creator, Lucas de Groot, and says he sees nothing wrong with that font.
In 1913, calligrapher Edward Johnston was commissioned by Frank Pick, the managing director of London Transport, to design a font that would make the lettering across the transport network more uniform. From this sprang the original font the London Underground used, released in 1916 and called Johnston Sans. In the 1970s, designer Eiichi Kono updated it to adapt it to new technology (for example, turning the periods into diamonds). It has since been adapted again to create a new lettering, "Johnston100", introduced in 2016. Transport for London supported the new lettering, claiming it "contains subtle changes to make it fit for purpose in the 21st century."
![mind the gap.webp](https://static.wixstatic.com/media/c2c472_46ebd0747ddb44888aac6977a8834491~mv2.webp/v1/fill/w_461,h_259,al_c,q_80,usm_0.66_1.00_0.01,enc_avif,quality_auto/mind%20the%20gap.webp)
![London Underground sign g.jpg](https://static.wixstatic.com/media/c2c472_c206b7f2e75242ac8982ec55b0926a6d~mv2.jpg/v1/fill/w_93,h_106,al_c,q_80,usm_0.66_1.00_0.01,enc_avif,quality_auto/London%20Underground%20sign%20g.jpg)
One example is the colour being changed to blue on the "Mind the gap" sign. While it might not seem like a significant change from a distance, overlaying the two signs reveals differences in the style of the font, the greatest being in the letter "g". Head of TfL design Jon Hunter said updating the typeface was an "important step forward" in the age of technology, and he hopes this font for the London Underground will last another 100 years and beyond.
A serif is a decorative line or taper (often referred to as tails or feet) added to the beginning or end of the strokes of a letter. Sans serif ("sans" is French for "without") fonts are made up of simpler lines, with all the strokes in a letter roughly the same width. Serif fonts read as more professional and elegant, while sans serif fonts are casual by contrast. Sans serif is better suited to screens, as some screens cannot render the details of serif fonts. Serif fonts are more commonly used in print, where their elegant, formal look makes them a good choice for businesses.
![image.png](https://static.wixstatic.com/media/c2c472_688dcb68e76747039688e2a27ee34509~mv2.png/v1/fill/w_328,h_218,al_c,lg_1,q_85,enc_avif,quality_auto/c2c472_688dcb68e76747039688e2a27ee34509~mv2.png)
In 2023, the US State Department started changing the official font for its documents from Times New Roman, the popular font that had held the throne as the department's standard for almost 20 years, to Calibri. The change was meant to help employees who are visually impaired and was suggested by the secretary's office of diversity and inclusion. It has, however, led to complaints about the inconvenience and about Calibri not being aesthetically pleasing. Officials within the department have said they feel the change could lead to heated discussions and possibly internal revolt. This criticism is not the first in the State Department's font history, though: in 2004, Courier New 12 (a typewriter font) was replaced by Times New Roman as the official font, and the department received plenty of criticism at the time.
​
I feel like governments should have one or two fonts that are used widely throughout the ranks, but they should not forcibly implement a new font that some people may be uncomfortable with. What about you? Discuss with your team: do you believe governments should have standardised fonts? How should they pick them, and when should they deem the time right for yet another change?
Discuss with your team: should some fonts be reserved for exclusive use by AIs and others for humans?
![Screenshot 2024-02-21 at 1.08.01 pm.png](https://static.wixstatic.com/media/c2c472_f79c8dfdf4174dfd939eb6b53f44e4fe~mv2.png/v1/fill/w_515,h_419,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%202024-02-21%20at%201_08_01%20pm.png)
Thomas Phinney earned an undergraduate degree in psychology and political science at the University of Alberta, Canada, and a master's degree in graphic arts publishing, specialising in design and typography, at the Rochester Institute of Technology, before going on to work for 11 years at Adobe Systems in Silicon Valley. One time, his group received a request about the suspected forgery of a will, a case Phinney called "The Case of the Wicked Will". Using a digital microscope, Phinney found speckles of ink around each letter and traces of ink bleeding into the paper fibres. He concluded the "will" had been printed on an early inkjet printer at 300 dots per inch (dpi). However, that type of printer didn't exist in 1983, the year the will was supposed to have been made.
Phinney is like the Angela Gallop (a top forensic scientist) of typography. He has been an expert witness in plenty of court cases and has evaluated documents for the US Treasury and numerous TV shows and newspapers. He also consults for companies such as Microsoft and Google.
Type design is a craft that blends art and science, Phinney says. "While true innovation is rare, people consistently come up with variations on existing themes or combine existing elements in new ways." He recommends the sciencegothic.com site, where users can adjust the different elements of a font while staying within the Science Gothic family. In the past, that was a feat achievable only by juggling more than 200 separate fonts, and Phinney claims there is much more one could do with fonts.
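Sites like sciencegothic.com are built on variable fonts, the OpenType feature mentioned earlier: a single font file exposes continuous design axes (weight, width, and so on) instead of hundreds of separate static files. Purely as an illustration of that idea (not the site's or Phinney's actual tooling), here is a minimal Python sketch using the fontTools library; the filename and the axis tags are assumptions about whatever variable font you point it at.

```python
# Minimal sketch, assuming fontTools is installed and you have some variable
# font file locally; "ScienceGothic-VF.ttf" is a hypothetical placeholder name.
from fontTools.ttLib import TTFont
from fontTools.varLib.instancer import instantiateVariableFont

font = TTFont("ScienceGothic-VF.ttf")

# List the design axes the font exposes (e.g. weight, width, slant).
for axis in font["fvar"].axes:
    print(axis.axisTag, axis.minValue, axis.defaultValue, axis.maxValue)

# Pin two axes to fixed values to produce a single static instance,
# the kind of style that once had to ship as its own separate font file.
static = instantiateVariableFont(font, {"wght": 700, "wdth": 100})
static.save("ScienceGothic-Bold.ttf")
```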
​
How did Phinney launch his career as a typography detective? Well, he kept getting involved in cases, such as the time he caught a rabbi trying to secure a job using forged graduation documents. The faxed paper (faxing it was apparently an attempt to make it look more authentic) was dated 1968, but the font in which the man's name was set didn't exist until 1992, thus ending "The Case of the Reprehensible Rabbi." In 2018, he decided to formalise his hobby as a career, and just two years later, half his revenue came from this work, with the rest coming from designing fonts.
​
According to Phinney, most forensic cases fall into the "nefarious" category. In one such case (The Case of the Dastardly Divorce), a man counterfeited debt documents to prevent his wife from getting the share of the money she was owed in their divorce. The other type of case involves establishing whether a document meets an official standard; one example was whether the 5-point type on Justin Timberlake's CD liner notes was adequate to count as a copyright warning (Phinney termed it "ridiculously tiny"). However, staying within such typography rules can be difficult, Phinney admits, as they differ from state to state: California requires at least a 12-point font, while New York has requirements unlike those of any other state.
​
To Phinney, a perfect case is one exposing problems which can influence a lot of people. What about you? What do you think the perfect case for a font detective is?
​
Discuss with your team: what other technologies do we take for granted when we’re at stores or shopping online?
Self-checkout, invented in the 1980s as a way to cut labor costs, has been rethought over the past few years. Retailers have found that self-checkout leads to "shrink": higher losses due to intentional shoplifting and customer error. Research in the US, the UK, and other European countries shows that companies with self-checkout had a loss rate of around 4%, more than double the industry average.
​
One of the problems self-checkouts have is that they aren't "smart" enough. Sometimes customers have to choose between multiple barcodes on a product, or they type in the wrong code for vegetables or meat by accident. Then there's the problem of an item not scanning successfully, and a staff member having to come over and sort it out. Other customers steal things, taking advantage of the lightly staffed self-checkout area. Common tactics include not scanning an item, ringing up a more expensive item (steak) as a cheaper one (bananas), scanning counterfeit barcodes attached to their wrists, or properly scanning everything and then walking out without paying. Some stores have implemented security measures, such as adding weight sensors. However, these have led to new problems, such as employees needing to log in and intervene whenever a sensor malfunctions.
​
Businesses such as Walmart, ShopRite, and Wegmans have ended some self-checkout options in smaller stores or in their apps, and Costco announced it will station more staff at its self-checkout booths.
Just after 8am on June 26, 1974, the first product marked with the Universal Product Code (UPC) was scanned at Marsh Supermarket in Troy, Ohio. The "customer" was Clyde Dawson, head of research and development for Marsh Supermarket; the pioneering cashier who "served" him was Sharon Buchanan. Legend has it that Dawson pulled out a multi-pack of Wrigley's Juicy Fruit chewing gum because no one was sure a bar code could be printed on something as small as a pack of gum.
...Let's start over, shall we?
The "inventor" of bar codes, Joe Woodland, said he got the inspiration while sitting on Miami Beach. He was keen on inventing something that would make shopping easier after one of his schoolmates (Bernard "Bob" Silver) informed him of a conversation between a distraught supermarket manager and dean at Drexel Institute of Technology in Philadelphia (Woodland's old school). He then left grad school in the winter of 1948 and lived in a Miami Beach apartment under his grandpa's name. However, his brilliant idea didn't formulate until January 1949, and even then, the world did not recognize the potentials behind it.
​
Woodland had learned Morse Code in the Boy Scouts, and he was thinking of it that particular day sitting on the beach. "I remember I was thinking about dots and dashes when I poked my four fingers into the sand and, for whatever reason—I didn’t know—I pulled my hand toward me and I had four lines. I said ‘Golly! Now I have four lines and they could be wide lines and narrow lines, instead of dots and dashes. Now I have a better chance of finding the doggone thing.’ Then, only seconds later, I took my four fingers—they were still in the sand—and I swept them round into a circle." Apparently, Woodland just missed the modern bar code by about that much. However, we must give him a star for effort, as the idea back then was to make the code scannable from all angles. In 1949, Woodland and Silver filed a patent, trying to get the bar code system up and running. It was finally approved in 1952, but even then, they lacked the minicomputer and the "laser" to read the bar code.
![image.png](https://static.wixstatic.com/media/c2c472_02c0e207cafc4b52ac58730efc27e944~mv2.png/v1/fill/w_644,h_483,al_c,lg_1,q_85,enc_avif,quality_auto/c2c472_02c0e207cafc4b52ac58730efc27e944~mv2.png)
Invention of lasers: to sum it up, Theodore Maiman built the first working laser (an acronym for Light Amplification by Stimulated Emission of Radiation) in 1960. Because Maiman claimed the beam was so concentrated that if it were shone from LA to San Francisco (pop quiz: name two other things off the top of your head that were in San Fran) it would spread only 100 feet, and because it was hot enough to cut through materials, the laser made the headlines of the Los Angeles Herald: "LA Man Discovers Science Fiction Death Ray."
​
A booklet produced in 1966 by the Kroger Company, which ran one of the largest supermarket chains in North America, signed off with a despairing wish for a better, faster way to check customers out (basically, scannable product codes; ikr, it's the ultimate wish for supermarkets back then).
​
Other ideas that came before bar codes included having customers pick out punch cards identifying what they wanted to buy and present them to a cashier, who then retrieved the goods from the store. Unsurprisingly, it was quickly discarded. Then there was a patent for a system in which the supermarket shopper threw everything into a basket, which was pushed under a scanner that identified each item and printed out a bill (personally, I think that one is a tad similar to the Amazon Go and self-checkout ideas).
There were a lot of disagreements, and compromises had to be made. After four years, bar codes (bull's-eye ones) were finally ready to be introduced. In the end, seven companies, all of them based in the United States, submitted systems to the Symbol Committee, a technical offshoot of the Ad Hoc Committee. RCA, having demonstrated its system to the committee in Cincinnati, took the view, not unreasonably, that it was the only real contender. International Business Machines (IBM) decided to enter the competition as well, considering that it had Joe Woodland on staff. However, the final IBM submission was created by George Laurer, who, faced with the committee's constraints, decided to create a rectangular bar code. The UPC has come a long way: in 2004, Fortune magazine estimated that the bar code was used by 80 to 90 percent of the top 500 companies in the United States. Even former president George W. Bush was photographed at a national grocery convention looking intently at a supermarket scanner and having a go at swiping a bar-coded can over it (although the media attacked him, claiming he was out of touch with modern technology).
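Incidentally, Laurer's twelve-digit UPC ends with a check digit that lets scanners catch misreads. Here is a quick sketch of that standard calculation in Python (the example digits are arbitrary, not any particular product's code):

```python
def upc_check_digit(first_11_digits: str) -> int:
    """Compute the 12th (check) digit of a UPC-A code from its first 11 digits."""
    digits = [int(d) for d in first_11_digits]
    # Digits in odd positions (1st, 3rd, ...) are weighted 3; even positions weighted 1.
    total = 3 * sum(digits[0::2]) + sum(digits[1::2])
    return (10 - total % 10) % 10

# Example with an arbitrary 11-digit prefix:
print(upc_check_digit("03600029145"))  # -> 2, so the full code would be 036000291452
```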
Just as barcodes transformed checkout, QR codes have changed many other everyday experiences, from debate tree distribution (sometimes) to accessing restaurant menus. But a change that seemed inevitable during the pandemic has run into resistance since. Discuss with your team: is this pushback a classic example of society resisting technological progress, only to eventually succumb? Are there any technologies that were supposed to change the world which were rejected and stayed rejected?
Amazon has released a new kind of technology in its Amazon Go and Fresh stores in the US and in London (UK), and these stores have a unique feature: you can just take the items you want and walk out. How does this work? Customers use an app called Amazon Go, installed on their smartphones. They scan the app at the entry gate, the products they pick up are automatically tracked by cameras and weight sensors in the shop, and as they walk out, their account is automatically charged for those products. Receipts can be emailed to customers. Shoppers can also revert to the traditional method of shopping; they just have to inform the staff at the mechanical gate.
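To make that flow a little more concrete, here is a purely hypothetical toy model in Python; none of the names or logic come from Amazon, it just mirrors the steps described above (identify the shopper at the gate, let sensors update a virtual cart, charge the account on exit).

```python
# Hypothetical sketch of a "just walk out" session; a toy model, not Amazon's system.
from dataclasses import dataclass, field


@dataclass
class Session:
    shopper_id: str                           # identified by scanning the app at the gate
    cart: dict = field(default_factory=dict)  # item name -> (quantity, unit price)

    def sensor_event(self, item: str, price: float, picked_up: bool) -> None:
        """Cameras/weight sensors report an item taken from or returned to a shelf."""
        qty, _ = self.cart.get(item, (0, price))
        qty += 1 if picked_up else -1
        if qty <= 0:
            self.cart.pop(item, None)
        else:
            self.cart[item] = (qty, price)

    def exit_store(self) -> float:
        """Charge the shopper's account for whatever left the store with them."""
        total = sum(qty * price for qty, price in self.cart.values())
        print(f"Charging {self.shopper_id}: ${total:.2f} (receipt emailed)")
        return total


# Example walk-through:
session = Session("shopper-123")
session.sensor_event("sandwich", 4.50, picked_up=True)
session.sensor_event("juice", 2.00, picked_up=True)
session.sensor_event("juice", 2.00, picked_up=False)  # put back on the shelf
session.exit_store()                                  # charges $4.50
```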
![Amazon Go.png](https://static.wixstatic.com/media/c2c472_73b92fab37764091aad8c48cec78fa9d~mv2.png/v1/fill/w_396,h_264,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Amazon%20Go.png)
Other payment options (they vary at different stores):
- Amazon app
- Credit card (just insert it at the gate)
- Amazon One (scanning the palm to pay)
- Other: cash/SNAP/EBT
According to Amazon's official website, anyone can shop at Amazon Go, without Amazon Prime or even an Amazon account. Most items can be returned within 30 days. Customers get a return code online, and when they return the actual item, they get a reward ($2 off $5 or more in items).
In January 2024, Anne Thériault complained in The Walrus under the headline "Please Don't Make Me Use Another QR Code Restaurant Menu." She described how pleasant the menu was during a recent trip to Italy: she felt like a "character in a book", dining on a rooftop patio in Florence, under the Tuscan sun, reading the menu. Thériault was there during "white truffle season" (September to December), and she points out that to people like her, who would rather read a character's grocery list than a battle scene, a well-written menu can feel like "a piece of literature in and of itself." Back in a North American restaurant, however, she had to scan a QR code for the menu, and even though the restaurant was nice, the QR code menu was a downer.
QR codes (short for "quick response" codes) have been around since 1994, when they were invented at the Japanese company Denso Wave (employee Masahiro Hara came up with the idea while playing Go), but they didn't spread to other regions of the world until the 2000s. During the pandemic, many restaurants converted to QR code menus, and most decided to keep the change, with still more converting their menus since. Hospitality Technology's 2022 Restaurant Technology Study reported that 66 percent of restaurants in the US used QR code menus, and 19 percent planned on adding them. According to the Toronto Star, which cited research by Dalhousie University, three out of every five Canadians used QR codes at restaurants or grocery stores in August 2021.
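The conversion itself is technically trivial, which is part of why QR menus spread so quickly. As a minimal sketch (assuming the third-party Python qrcode package, with Pillow, is installed; the URL is a made-up placeholder), turning a menu link into a printable code takes a couple of lines:

```python
import qrcode  # third-party package: pip install "qrcode[pil]"

# Encode a (placeholder) menu URL and save it as an image to print for the tables.
img = qrcode.make("https://example-restaurant.test/menu")
img.save("menu-qr.png")
```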
"My complaints with QR code menus are minor but many. I love the communal aspect of dining out with friends or family, and I hate the way that QR code menus take me out of the shared moment and force me to look at my phone (which, of course, leads me down the rabbit hole of checking my various notifications). I hate the way QR code menus mean scrolling, pinching to adjust size, and sometimes juggling between multiple tabs instead of just having to glance over a page or two," says Thériault.
Arguments supporting Thériault's case:
- Menus can also be historical documents (the New York Public Library holds approximately 45,000 menus dating from the 1840s, which it is digitising through its "What's on the Menu?" project)
- Privacy: QR codes track consumer activity
- They limit potential customers to those with some kind of electronic device (a phone, an iPad, or something else that can scan the code and pay)
- Social: as Bloomberg reported in 2021, technology that promotes contactless dining has already been linked to job losses in the service industry
Sympathisers:
- Conor Friedersdorf, who wrote a polemic against QR codes for The Atlantic back in 2022
- The New York Times, which published an article on QR codes' alleged demise in 2023
- X (formerly known as Twitter), where a brief search for "QR code menus" leads to widespread reactions (albeit both ways); one of Thériault's favourites is from user @mlokeshceo: "Menu > QR code."
Writer and cultural historian L. Sasha Gora, who leads a research group at the University of Augsburg, is exploring something called "culinary extinction": when an ingredient used in certain dishes goes extinct because humans liked eating it so much that they hunted it out. An example is the passenger pigeon (scholars from last year should remember this), which died out from overhunting. Even as its numbers declined, passenger pigeon pie stayed on menus (though a given pie might have been authentic or a fake made from more common pigeons).
A "ladies' menu" was a type of menu printed without prices so women wouldn't know how much their dates were spending on them. After a Californian woman threatened a restaurant with a discrimination lawsuit, the practice mostly disappeared.
Aaaaaand we're done, scholars! (At least for this first bit, that is...)
1 down, 16 more to go! (I know, I'm just so inspiring, right?)
Anyways, cya until the next chapter (be sure to do quizzes!)