
Barry Shein
13 March 2018
Does anyone under 50 years old still use ditto marks?
Unicode has the 〃 assigned (U+3003).
According to Wikipedia they go back almost to 1,000 BC.
But I don't think they survived computing.
Ditto mark - Wikipedia
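For anyone curious, the codepoint Barry mentions is easy to poke at from Python's standard library (a minimal sketch; `unicodedata` ships with CPython):

```python
import unicodedata

# U+3003 is the ditto mark discussed above
mark = "\u3003"
print(mark)                    # 〃
print(unicodedata.name(mark))  # DITTO MARK
```

The official name registered for U+3003 really is "DITTO MARK", which makes it findable with character-picker search tools.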
Barry Shein Hmm, that Wikipedia page says U+3003 should be avoided except in CJK (Chinese, Japanese, Korean) contexts, because most (Western?) languages use whatever the right-leaning double quote is. I wonder why they group Korean with Chinese and Japanese. The latter two are ideographic, with no phonetic information*; Korean is just a phonetic character set like Latin, Greek, Russian, etc. (* OK, there may be some hints, but by and large.)
Richard Sexton Do you know anyone that's ever used them?
Barry Shein I use them all the time when writing by hand.
Lindsay Marshall I do as well
Richard Sexton You mean you write a tick mark to mean ditto? You use a quote mark, though? Don't tell me you write a unique backquote diacritic?
Barry Shein By hand I just write a double quote under whatever is to be repeated. I was mostly wondering if (younger) people use ditto marks at all, in handwriting in particular, or is the concept fading?
Richard Sexton Well, only if nobody ever looks at a book printed before 2010, I suppose; they show up so often in (at least scientific) literature and in tallies and accounts ledgers that I can't imagine them going extinct.
I use it all the time but didn't know there was a special character for it (and I actually just recently had occasion to look at all of them for a while; that's a fun rabbit hole).
diacritics r us
Frank Wales I also use them in writing. Oh, wait, you said *under* 50...

Bob Frankston Remember we're still living in the legacy of the typewriter -- be thankful `!` isn't `'^H.` (apostrophe, backspace, period).
Barry Shein It's interesting to think that one critical development in the history of modern technology was the invention of the typewriter nearly 200 years ago (first patent, 1829).
Even in the 1970s a very popular computer input device (the IBM 2741) was just a Selectric typewriter (very popular in offices for general typing work) adapted for communication with their computers.
For many years BU's main dial-up number for students etc. was 617-353-2741 (or maybe it was -2740, which would be IBM's designation for the entire 274x line); people probably wondered why such a peculiar number when they could have used -9999 or whatever.
Well...there's the rest of the story! No doubt someone's idea of a joke, since I don't think one could reasonably hook up a modem to a 274x or own one at home. But they did have rows of 2741s in the student "pit" in the computing center. IBM 2741 - Wikipedia
Barry Shein P.S. Those 2741s gave rise to many "dancing selectric balls" programs. You'd write software which would cause the ball (it used a ball with the letters on it) to swoop and gyrate to some tune and play music (probably from a cassette) to match.
Bob Frankston IBM would come to my house to tune my 2741. I typed too fast and often the shift was lost, so that it would type in one case but the computer thought it was in the other. Caused some interesting problems, including leading to a rule on Multics that ca…
Richard Sexton X-Posted-To: Comp.fonts
I disagree. I think IBM set back the evolution of typography immensely.
Look at the EBCDIC character set. Now look at ASCII. Other than a-z, A-Z and 0-9 you get a handful of characters. About 30.
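The "about 30" figure above can be sanity-checked against Python's standard library, whose `string.punctuation` constant collects exactly the printable ASCII characters that aren't letters, digits, or whitespace:

```python
import string

# ASCII's non-alphanumeric printable characters, per Python's stdlib
print(len(string.punctuation))  # 32
print(string.punctuation)       # !"#$%&'()*+,-./:;<=>?@[\]^_`{|}~
```

So "about 30" is right: 32 symbols, and that was the entire non-alphanumeric repertoire for decades.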
That set existed because of word size in computers; at some point we made a new table with what the world thinks is the minimal set to write what it needs. This is about 45,000 characters now.
So no, I don't think IBM helped, quite the opposite, by constraining the expression of characters to a homeopathic subset. Let's face it, IBM had become so lazy with 360 junk by then that they were dragged into dot matrix and bitmapped screens, the sort of things they're nominally on the leading edge of.
It didn't merely rankle the aesthetic; we know from studies that constraints on language yield poorer cognitive development. A couple of teams looked at this phenomenon: groups with different worldviews were asked what color an orange was and how many shades their language has for that color.
In every case the people with the most words for the color orange were able to most accurately differentiate shades of the color. The fewer words they had, the fewer shades. In the West we tend to think of light-orange orange and dark-orange (which we also call brown, although brown usually has more green in it than that). One group had no word for orange and referred to it as either yellow or red, about half each. They were not able to pick out an orange ball from a box of red, orange and yellow balls.
You wouldn't believe some of the characters that are in the html5 character table. It's pretty freaky.
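That HTML5 named-reference table actually ships inside Python's standard library, so it's easy to browse (a quick probe; the exact entry count varies slightly by Python version, so no precise number is claimed here):

```python
import html
from html.entities import html5  # the HTML5 named character references

# Resolve a few named references to their characters
print(html.unescape("&copy; &hearts; &bumpeq;"))

# The table holds well over two thousand named entries
print(len(html5))
```

Scrolling through `html5.keys()` is exactly the "pretty freaky" rabbit hole described: everything from `&Bumpeq;` to `&zigrarr;`.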
Bob Frankston Be careful about taking Whorfianism too far. As for HTML5/Unicode - it's a lot more complicated, because mixing typography with character sets gets very messy.
Joel B Levin I really hated the 2741. We were living in pure ASCII land (TTY and compatibles mainly) and wrote the software to add terminal support to Arpanet packet switches (i.e. to convert IMPs into TIPs). Sending ASCII characters down a terminal line was eas…
Barry Shein IBM was about batch data processing. That the data and programs had to be entered and developed somehow was often answered with a shrug, you pay those people, right?
Bob Frankston Some of us (including Lynn Wheeler) saw a different IBM.
Lynn Wheeler CP67 was installed at the univ.'s 360/67 in Jan 1968. It leveraged the terminal controller's SAD command to switch port scanner type for automagic terminal type identification (1052 & 2741). The univ. had a bunch of TTYs ... so I added TTY/ASCII support, extending the automagic terminal type identification. I then wanted to have a single dial-in number (a single "hunt group") ... but it didn't quite work: while it was possible to switch the type of port scanner for each port ... the line speed was hardwired. This was part of the univ.'s motivation to start a clone controller project: build a channel interface board for the Interdata/3, emulate a mainframe terminal controller, but add dynamic line speed determination. This was extended to an Interdata/4 for the channel interface and a cluster of Interdata/3s for port scanners. Interdata then started marketing the box, and it continued to be marketed under the PE logo after PE bought Interdata. Four of us got written up for being responsible for (some part of) the mainframe clone controller business.
An early clone controller "bug" was finding that the IBM controller convention stuffed the leading bit (of a character) into the low-order bit position ... so when bytes were transferred to mainframe memory, each byte was in bit-reversed order and had to be translated back. Also, the Selectric terminals used tilt/rotate encoding ... before transmission, mainframe characters had to be converted to their tilt/rotate representation. To be compatible with the IBM controller convention ... we also had to bit-reverse bytes and then reverse them again before they went out on the line. Note UofMich did something similar for their 360/67, which they wrote MTS for ... but used a DEC PDP for the controller. Other trivia: an IBMer was the major person behind the ASCII standard and expected the 360 to be announced as an ASCII computer ... however they had to use a lot of older BCD unit record gear (card readers/punches, printers) and started out with EBCDIC compatibility,
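The bit-reversal the clone controller had to perform can be sketched in a few lines (a minimal illustration of the transformation described, not the actual controller firmware; the function name is mine):

```python
def reverse_bits(byte: int) -> int:
    """Reverse the bit order of one 8-bit value, as required by the
    IBM controller convention that put the leading bit of a character
    in the low-order bit position."""
    result = 0
    for _ in range(8):
        result = (result << 1) | (byte & 1)  # shift out low bit, shift into result
        byte >>= 1
    return result

# 0b10000000 (leading bit set) becomes 0b00000001
assert reverse_bits(0b10000000) == 0b00000001
# Reversing twice restores the original byte, which is why the clone
# controller could undo the transformation symmetrically on the line side.
assert all(reverse_bits(reverse_bits(b)) == b for b in range(256))
```

The symmetry in the second assertion is the key property: the same routine serves both directions of the memory/line conversion.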
biggest computer goof ever
I didn't get a dialup 2741 at home until March 1970; I had it until the summer of 1977, when I got a 300 baud CDI miniterm. I had that for a year or two and then got a 1200 baud (IBM) 3101 glass teletype ... and it was replaced when I got my own IBM/PC at home. By that time, IBM was using a special 2400 baud encrypting modem for home & travel use. Some of the CTSS people
had gone to the 5th floor to do MULTICS
and others went to the IBM science center on the 4th floor and did virtual machines, online computing, internal network, computer performance, etc. As might be expected there was a little competition between the 4th & 5th flrs. I joined the science center after graduation and one of my hobbies was developing & supporting enhanced operating systems for internal datacenters. It wasn't fair to compare the total number of VM customers to the number of MULTICS installations, or even the number of VM internal datacenters to MULTICS .... however I could compare the number of my "CSC/VM" internal datacenters ... which was more than the total number of all MULTICS installations (over its lifetime).

Lynn Wheeler The internal network was larger than ARPANET/Internet from just about the beginning until sometime mid-80s. I transferred to SJR in the 70s and in 1982, it was first to get CSNET gateway from UDEL.EDU ... some old email
email from UDEL CSNET about transition from ARPANET to TCP/IP
At the time of the great 1/1/1983 cut-over, ARPANET had about 100 IMP nodes and 255 connected hosts, while the internal network was rapidly approaching 1000 nodes. Old Post with list of corporate locations around the world that added one or more new nodes during 1983:
The internal network technology was also used for corporate sponsored university network (also larger than Internet for a time)
I had project I called HSDT that was working with T1 and faster speed links. Also working with the NSF director and was suppose to get $20M to interconnect the NSF supercomputer centers. Then congress cuts the budget, some other things happen and eventually NSF releases RFP (in part based on T1 links that we already had running). Internal politics prevent us from bidding and the NSF director tries to help by writing the company a letter (with support from other agencies), but that just makes the internal politics worse (as does comments that what we have running is at least 5yrs ahead of all bid responses).
Lynn Wheeler Note that GML was also invented at the science center, in 1969 ("G", "M", "L" come from the first letters of the inventors' last names); after a decade it morphed into the ISO standard SGML, and after another decade it morphed into HTML at CERN. The first webserver in the US was on SLAC's (virtual-machine-based) vm370 (a descendant of cp67).

Lynn Wheeler other trivia, 4th/5th flr rivalry, old 1979 email about some people from USAF data services (#71 on Multics site list) coming out for visit to talk about getting 20 (vm370) 4341s.
the visit was delayed until the following fall, by which time the order had grown to 210 (vm370) 4341s; a follow-on email about finally a colonel, a couple of majors and some others from Air Force data services coming by (names redacted to protect the innocent)

Richard Sexton Barry Shein My first job was running the IBM 1130 that did the phototypesetting for the local newspaper. Voss iss mit "batch job"? The hoops it had to jump through to make those codes. It replaced the hot lead stuff.