History of Writing               

Related Video(s):

A Brief History of the Code: The Big Bang of Literacy (Early Code History)
Written English: The First Millennium Bug (English Code History)

Notes: 1) This page is a work in progress and does not yet comprehensively cover its topic or include all the COTC and web resources its topic deserves. 2) Bold is used to emphasize our [COTC] sense of importance and does not necessarily reflect gestures or tones of emphasis in the original source. Some passages are lightly edited for brevity or flow; see the referenced originals for exact quotes.



History of the Alphabet

Well, I think there are two histories when we talk about the history of the alphabet. There's the history of letterforms and how they came into being, and there are a lot of myths and misunderstandings about that history that are very common. Finding sources that allowed me to see a broader base of that history and to have my own misunderstandings corrected was one of the really major experiences for me of this research. 

The other history is the history of ideas about the way we think about the alphabet, and all of the properties that we project onto these letters, whether for magical purposes or religious purposes or interpretive purposes.  So that’s another entire history, and I think that the history of literacy and the history of reading, and of spelling reform and of shorthand notation, and of phonetic systems, and all of these various variants on the alphabet are also a part of that history of ideas. 

There are really two parallel histories. What's interesting to me is how in the twentieth century those two histories have separated. More and more we have specialists who look at the history of the alphabet within the origins of writing systems in the ancient Middle East and in that place between the Egyptian and ancient Sumerian cultures. Those are extremely specialized scholars and archeologists. But, more and more we've lost the other history, which is the history of ideas about the alphabet. We tend, in the late twentieth and early twenty-first century, to bracket out the idea that letters have a magical power or a mystical power. I think that's a mistake, because I think it's exactly at the intersection of these two things that the alphabet functions most effectively. 

If we go back to that history of the letterforms, and I talk about the myths and the misinformation, there are a number of really crucial points that I think of as high points, or jewels, within this research. One of those is the misunderstanding about the number of writing systems that have ever existed within human history.

Johanna Drucker, Professor of Media Studies and Director of the Interdisciplinary Program in Media Studies at the University of Virginia; Author, The Alphabetic Labyrinth. Source: COTC Interview - http://www.childrenofthecode.org/interviews/drucker.htm#TheHistoryoftheAlphabet

Two Writing Systems

There are only two writing systems in existence today, Chinese characters and the alphabet. People often say, well, what do you mean by that? There's Arabic letters, there's Indian scripts, there's Ethiopic letters, there's all of these various kinds of letterforms. What do you mean there's only two writing systems? Most people don't understand that the alphabet is actually a synthesis of two early writing systems, Egyptian hieroglyphics and various forms of cuneiform. Once the alphabet came into existence, those other forms went out of existence. Not causally. Not because of the alphabet, but due to various other cultural and historical transformations. But all the major writing systems that we use today descend from either the alphabet or Chinese writing.

Johanna Drucker, Professor of Media Studies and Director of the Interdisciplinary Program in Media Studies at the University of Virginia; Author, The Alphabetic Labyrinth. Source: COTC Interview - http://www.childrenofthecode.org/interviews/drucker.htm#TwoWritingSystems

How Writing Systems Evolved

I think we can learn quite a bit about the neurological problem by looking at how writing systems themselves have evolved. In a social sense, writing is evolving in response to changes that stuck or didn't stick in these various systems. The few times that language has turned into writing, become externalized in the world, it almost always began with a kind of pictorial mode. The problem with pictures is that, of course, they're not like sound, not like language. The advantage of pictures is that they're much closer to what they represent, in some sense. We can even look at some of these ancient writing systems and make some educated guesses, and it turns out they're usually wrong for interesting reasons. One of those reasons is that almost as soon as picture-like writing came about, people began to use the pictures also to represent sounds, the names of things, a rebus, like you find in a Coke bottle cap, in which you're trying to figure out the meaning of something by using a picture to represent a sound. 

Most of the pictorial writing systems very quickly took on sound iconism, that is, sound likeness as well. The word for that thing has a sound, and I can use that sound element to put together with other sound elements. What happened was that kind of system got co-opted very quickly. Now in one sense you can say, well, this is a disadvantage, because the interpretation of the meaning might be easier if you could see the picture, if you got something about the picture of it right away, because vision is one of the ways we remember objects and relationships. On the other hand, it's broken up into bits and pieces, it's separated. And of course it's quite distant from speech. Speech has recoded this in a whole different way. What seems to have happened in most of the world's written forms, not all of them, but even those that still have a bit of pictorial nature to them, is that they have acquired this feature: they have become representations of the speech stream itself, and only secondarily of something in the world.

What that tells us is that these have become adapted to us, that is, to our constraints, the things that we do well automatically. They have become more adapted to what we do easily, but they are not so flexible and facile as language, which has this kind of evolutionarily instantiated support system that makes it relatively effortless. So, something that is more encoded than a picture is actually easier for communication in this regard. It's still a kluge, it's still an external invention for which brains weren't well designed. I think the evolution of writing systems is a hint to what the brain design is. They're not perfect, they're also satisfices. There are also other factors that are influencing them, but they are evolving towards a better fit to what we do automatically and easily.

Terrence Deacon, Professor of Biological Anthropology and Linguistics at the University of California-Berkeley. Author of The Symbolic Species: The Co-evolution of Language and the Brain. Source: COTC Interview - http://www.childrenofthecode.org/interviews/deacon.htm#HowWritingSystemsEvolved

Emergence of Writing

Dr. Terrence Deacon:  You might ask about the minimal size of a society needed to produce language in the same way we would talk about the minimal conditions needed to produce writing in human societies. I think they are similar kinds of problems. What has to happen to produce writing, as far as we can tell by looking at the examples that we've seen, is that writing emerges in complex societies, with dense populations, with what amounts to city-like organization in which large numbers of people have to be organized and controlled. Oftentimes trade seems to be involved in this.

David Boulton: Almost all of the earliest artifacts of writing we have are trade related.

Dr. Terrence Deacon:  Yes, trade related. Although in Central America it may not be quite so simple as that. It may be more around religious communication and ritual organization. But it could well also be the paying of something like taxes and tithes.

David Boulton:  Right. Wampum and what have you.

Dr. Terrence Deacon:  Right. What we see is that there is not just a number of people but, in a sense, certain communicative demands that show up with numbers: the coordination problem of exchanging things and keeping things organized, and particularly maintaining the differentiation of roles in that population. I think the same applies when we go back to look at the origins of the first symbolic communication, and by this I mean something well before what we would recognize as language: communicating things in the abstract, communicating things that are not necessarily in the here and now, naming, discussing and representing things that might be or were or could have been. In that process one needs a social organization that's already creating that need. In some sense, the social organization must have been moving in that direction.

Terrence Deacon, Professor of Biological Anthropology and Linguistics at the University of California-Berkeley. Author of The Symbolic Species: The Co-evolution of Language and the Brain. Source: COTC Interview - http://www.childrenofthecode.org/interviews/deacon.htm#EmergenceofWriting

Alphabet Comes into Being

Dr. Johanna Drucker:  As various kinds of social formations come into play, the role of writing comes to the fore. Certainly we see that in the way that the alphabet comes into being in the Mediterranean region within the 1700 BC period. Though writing systems exist in the ancient Middle East in the 3000 to 2700 BC period, which is when we see the emergence of the hieroglyphic and cuneiform systems, the alphabet itself was formed out of trade-route activity about a thousand years later. There's a wonderful bit of research by a British archeologist named Flinders Petrie, from the early twentieth century, in which he actually traced the movement of the various symbols and signs that come to constitute the alphabet through that region. He argues that they are simply a limited set of encoded elements that become agreed upon because they're relatively simple, they're easy to make, and they can be made in a lot of different materials. They function well enough to be traded between different language systems and different cultural systems. He really sees the alphabet coming about partly because of trade, mercantile reasons, and other functions within that particular domain.

David Boulton:  Another theory is the acrophonic principle - the notion that the first sound of the word a pictograph named became the sound value of that pictograph as a letter. There's also the articulatory theory of Robin Allott and others, which suggests that there seems to be a resemblance between the shapes of these letters and what is going on when you look at the profile of the human mouth as it's articulating them. Can you speak to that? 

Dr. Johanna Drucker:  I think what you're asking about leads directly into the interpretation of the letters as visual symbols. Certainly what we do know is that the names the letters of the alphabet have within the Hebrew naming system, aleph and beth and gimel, are all names of objects that are common within a nomadic desert culture. You could look around the camp of the Semitic tribes and you would see every item that is named within that alphabet system. Of course it makes sense; these are common objects. What are you going to use if you're going to come up with a familiar system to remember what the names of these characters are? 

From that, however, retrospectively what happens is that those names, like aleph the ox, come to be projected back onto the letterforms so that, and this is very much an invention of nineteenth-century historians, you start to see in the A the shape of the ox. Now there are no pictorial antecedents that are actually oxen that are the origin of that A. There are schematic forms that could be called an ox because they are some kind of circle or have some kind of horns, or some kind of B that has a square shape, so we say that could be a house. But there's no direct series of transformations where you can say a picture of an ox becomes simplified into a line drawing, then becomes a little diagram of the shape that has horns, and then turns upside down to become an A. 

So, that's a fictive history. On the other hand, there are many ways that the letters of the alphabet have lent themselves to interpretation. The articulatory system is another, and there are wonderful diagrams of the mouth and the throat and the teeth and the tongue that will show you that A shows a certain configuration, B is the lips pressed together, and, again, you can schematize almost any complex visual form into a set of stick figures that can then have other forms projected onto them. Is there a direct relationship? Probably not. 

One of the major movements for alphabet reform in the nineteenth century was led by Isaac Pitman and also by Alexander Bell, and these were systems in which the hope was that you could create a visual code that would almost be like an instruction set: that if you could make a little sign that showed you where to put your lips, your teeth, your tongue, so that you could say A properly, and where to put all the organs of speech so that you could say B properly, you would be able to create a self-reading alphabet. This is a great idea, but it turns out that learning that code is extremely complicated. 

David Boulton:  I noticed that you said Alexander, talking about Alexander Melville Bell, not Alexander Graham Bell.  

Dr. Johanna Drucker:  Right.
 

Johanna Drucker, Professor of Media Studies and Director of the Interdisciplinary Program in Media Studies at the University of Virginia; Author The Alphabetic Labyrinth. Source: COTC Interview - http://www.childrenofthecode.org/interviews/drucker.htm#TheAlphabetComesintoBeing

Early Alphabet

It does seem as though around 1700 BC in the Sinai peninsula we see evidence of the earliest form of what comes to be the alphabet, and that's the Proto-Canaanite alphabet. Some of that alphabet shows up in turquoise mines in areas where Hebrew-speaking people and Jews who were coming out of Egypt were working. But it is an area of cultural mix, and what the alphabet takes from the areas around the Tigris and Euphrates and the whole Sumerian civilization is a syllabic approach.

In other words, the idea that what you are doing is actually representing syllables comes out of the Sumerian use of cuneiform, whereas the Egyptian pictographs, the hieroglyphs, had been simplified. As we know, there are three forms of writing within the Egyptian system: there's the very formal hieroglyphics; there's a script form, which is hieratic; and there's a demotic script. So there are three different forms within the hieroglyphic system. 

Some of the early alphabetic signs can be traced by the relationship between the name of the sign and the sound that it represents to the Egyptian point of origin.  But, they also can be traced back to these Sumerian points of origin. So it seems like we have a cultural mix here. One of the most interesting things, I think, is that the sequence of letters in the alphabet is fixed in that period in 1700 BC. Now, it was a short alphabet at that point; it’s much shorter than our current alphabet, but that sequence of signs, the A, the B, the C, the D (at that point not called that, and they don’t quite resemble our contemporary letter forms), that sequence is fixed and used for the assembly of architectural structures. 

It's actually used the same way that we would use it in a little instruction book that comes, you know, on the night before Christmas when you're trying to put someone's bike together and it says part A, part B, part C. So, as a sequencing device the alphabet has been extremely useful. It's by that fixed sequence that we can also trace the development and diffusion of different offshoots of the alphabet.  

I have a wonderful quote here, actually, about the Moses connection. I’ll just read it.

In Exodus it says, "I will give thee tables of stone, a law and commandments which I have written." Who is I? Who is speaking in that? You know, "I will give thee tables of stone, a law and commandments which I have written." That's the voice of God. The tables were the work of God and the writing was the writing of God. And there are people who say, I mean, within the various interpretive traditions, there are those who say that the first writing was the tablets of the Ten Commandments; it was those tablets that Moses went up on Mount Sinai and brought down.

Johanna Drucker, Professor of Media Studies and Director of the Interdisciplinary Program in Media Studies at the University of Virginia; Author, The Alphabetic Labyrinth. Source: COTC Interview - http://www.childrenofthecode.org/interviews/drucker.htm#EarlyAlphabet

Back to the Beginning - 3000 BC

The Western biblical tradition, after all, is still very much with us in terms of our secular lives. Are these not the weeks in which the Ten Commandment tablets are being contested within a place of public justice in the United States? And the question of the division of church and state and what is the legacy of the Ten Commandments to our codes of law? We forget the code of Hammurabi, which is another one of the great codes within the Judeo-Christian Western tradition, as one of the things that underlies a lot of the law codes that we come to use in contemporary culture. 

But yes, the coincidence of the development of writing systems within that particular period is really interesting. We don’t have evidence of writing systems that are much older than 3000 BC. There are signs; there are marks, the famous Mas d'Azil stones, rock carvings and other forms of inscription. It seems clear that one of the fundamental activities of human beings is to represent themselves to themselves through mark making; that we understand the world through representation. We want to present all of our experience in some symbolic form and we see a magic and potency in that representation system.  

...Different writing systems do have different functional purposes as they come into being. It's interesting that the first Chinese characters that are invented are the characters of the I Ching. And so those have, again, an oracular power. They're used for divination, and they're used for the study and encoding of knowledge. And by knowledge is meant a moral knowledge, a spiritual knowledge, as much as a practical knowledge. So the I Ching characters are the oldest of the Chinese characters. 

Within the cuneiform tradition we know that the oldest forms that we have, at least, are ones that were used for business transactions. We’re pragmatic creatures. Now, the hieroglyphics, however, are not really so much instrumental in the business sense, they’re instrumental in the sense of public language:  monumental language, prayers, invocations, memorials, tributes, records of historical events.  

Johanna Drucker, Professor of Media Studies and Director of the Interdisciplinary Program in Media Studies at the University of Virginia; Author, The Alphabetic Labyrinth. Source: COTC Interview - http://www.childrenofthecode.org/interviews/drucker.htm#backtothebegin

Emergence of Writing in the West and East

In other words, in 3000 to 2700 BC we see Egyptian hieroglyphics come into being. One of the interesting things about that is we see no precursors. They come into being almost fully formed. What does that mean? That's an amazing thing and it's a difficult thing to explain. Around the same period, 3000 or 3100 to 2700 BC, we see cuneiform writing come into being, in a separate area of the ancient Middle East, where we also know the most advanced civilizations are going through rapid transformation in terms of their cultural institutions and the emergence of civic forms of government, whether monarchical or legalistic, or whatever. 

The administration of public affairs as well as the administration of private affairs is something that’s well served by writing systems, whether you need transactions for records or whether you need the law as a point of public record against which deviation, difference, transgression can be measured. So, I think we shouldn’t underestimate the necessity for law, whether it’s created within a sacred or a secular realm.  

We also have the need for memory, for cultural memory. I think, again, the transformation to a history sensitive, record-keeping culture is significant in this period as well. It’s one of the things that scripture provides, a sense of human cultural memory and myths and tales. The tales of Gilgamesh, other writings from the ancient Middle East, the Book of the Dead from Egypt, these are ancient scripts that, again, are preserved through writing, and passed on. So that’s quite early. It’s 1700 B.C. when we begin to see the alphabet come into being. 

The point of origin for the Chinese writing system is somewhat debated, but I looked it up this week in advance of coming to see you, and 1200 B.C. seems to be about the earliest date that anyone is willing to put onto the I Ching. You could stretch it back a little bit, but when you think about it, it is a little bit later. I think a good historian of Stone Age, Paleolithic, and Iron Age cultures would be able to describe the conditions of technological capability as well as the sort of state of the culture that would allow the writing system to be able to come into being, come into circulation. 

I think that, in fact, if you look at it at a micro level, the range of periods in which these writing systems come into being stretches over almost 2000 years - which is not trivial when you think about it. 

I think it’s actually more broadly separated, perhaps, than we realize. What we do know is that there really wasn’t any direct cultural transmission. There were in the Renaissance, many scholars who tried to trace a common origin for Chinese characters and Egyptian hieroglyphics. Again, there was some profound conviction that these two things must be connected and that they must have a common origin.

There are a couple of other points that I think are important to make. One is that there were other starts, aborted starts, for writing; and there are several of them. There’s a script that comes into being in the Indus valley, and that’s also around 2700 BC.  There’s an indigenous Easter Island script that comes about but doesn’t go anywhere.  In the ancient New World we have Mayan script coming into being, and again, without any kind of contact.  I think it’s an argument for the idea that symbol making and written forms of mark making are things that do come about through human cultural evolution for whatever reason to serve certain kinds of purposes. It’s not something that has one point of origin and diffuses. So, I think that’s also really interesting to think about. It argues for a cultural purpose. What that cultural purpose is, is again, hotly debated.  Those arguments really map very well onto different moments in history. 

For instance, we think about what happens in the early eighteenth century with someone like Voltaire or someone like Rousseau. Rousseau is a better example. Rousseau imagines that the invention of language comes about through the need for humans to express their passions. Then you have Schmandt-Besserat in the twentieth century saying, 'Guess what, folks, all of these cuneiform tablets that you thought were mystical, magical and so forth? They're filled with transaction reports and messages from husbands and wives who are split over distance, communicating about the business and meanwhile passing on family news.'

The cuneiform tablets actually contain messages that say things like, when you open this be sure that whoever brought it to you also brought you three cows, two goats, seven sacks of grain, and by the way, the kids are doing fine except that they’re quarrelsome, as usual; and the barnyard has been overrun by hens this year.

Again, it turns out that the nature of human communication is much more universal across five millennia than not. Therefore, we see that writing systems, even as early as 2000 B.C., are being used to communicate the same kinds of things that go in email today.    

Johanna Drucker, Professor of Media Studies and Director of the Interdisciplinary Program in Media Studies at the University of Virginia; Author, The Alphabetic Labyrinth. Source: COTC Interview - http://www.childrenofthecode.org/interviews/drucker.htm#TheEmergenceofWritingintheWest  

Historical Highlights 

David Boulton: Yes. One of the things that we're trying to show is that, as you've just indicated, there's this French influence on the language, there's the Chancery scribe story that John Fisher has brought forth, there's Caxton and the printing press, then there's the emergence of phonics in the 1600s, and then Benjamin Franklin and Noah Webster go to work in the 1700s.

Dr. Thomas Cable: Yes. You've kind of hit the high points. You peg your story to those, and you'll have something really interesting, I think. If you can start back with John Fisher, I don't have his articles here with me, but I've read them in the past and we've exchanged ideas, and I think he said that when Henry V went to the continent in 1420, he sent a letter back to his wife in English, and it kind of marks the beginning of English correspondence. There had been some English letters in the last few decades of the 14th century, but Henry V was a big deal.

Then his clerks of chancery — I think Fisher says there were a dozen masters of chancery who imposed a certain way of writing, which, again, has parallels with what the Spaniards were doing in the New World when they were trying to impose their alphabet on the Native American languages. There are so many directions that this goes. But just to stay with English, there is the competition between English and French, and the standardization of English, beginning with Henry V. I think Fisher has some very smart ideas on that.  

There are two factors in standardizing spelling: one is the invention of printing and its effects, and the other is the orthoepists and those who were trying to revise English orthography. I think they diddled around and didn't really accomplish much. William Bullokar had a system with a lot of hooks and diacritics, and that didn't catch on. Richard Mulcaster had a moderate system that might have had some influence. But I think even if they hadn't fretted about it, just this kind of impersonal, inexorable force of the printing press would have standardized it.  

But there's always this anxiety. I think that's interesting, too, because you mentioned the personal anxiety of children learning to read. There have been periods when the English as a nation, and then after them, America and the colonies, have felt anxious about their language.

So there was the whole to-do about an academy, which reached its culmination in the early 18th century with Jonathan Swift, and then Queen Anne, and it didn't amount to anything. But the English would look across at the continent, at the Academie Francaise and the Accademia della Crusca in Italy, and see that there were these countries that had committees that were regularizing the language. The English just felt inferior.  

So there was an effort for a century or more to establish an academy mainly because people couldn't agree. Lord knows how the French ever agreed to put an academy together. But the English did not, and so finally that just kind of fizzled out in the early 18th century.

But by the middle of the century it was done — the job of the academy was done by individuals. Samuel Johnson published his dictionary in 1755. About that same time, the Grammarians were publishing their prescriptive grammar, saying how it ought to be. They were just cooking rules up off the top of their head, using kind of false analogies with Latin and spurious analogies with mathematics and logic. But once they wrote it down, it would get repeated from one grammar to the next.    

Thomas Cable, Professor of English, University of Texas-Austin, Author of A History of the English Language. Source: COTC Interview - http://www.childrenofthecode.org/interviews/cable.htm#HistoricalHighlights

The Emergence of Standard English

I am a Medievalist. I teach Old English and Middle English and Chaucer. I did a great deal of work in the Middle English period with the manuscripts. And what was very interesting was to see that after the Norman Conquest English became nothing but a 'patois' (a peasant language without a writing system). Old English was pretty well standardized, but after 1066 all standardization disappeared and what you had was dialects. English was standardized by people who were trained to read and write in Latin. When they began to write down English they wrote it down with the equivalents of the Latin alphabet, the sounds of A and B and C as in Latin, but they applied that to the pronunciation of their own language.

So, there was no English standard language from let’s say 1100 to 1400 and by the time you get to Chaucer you begin to have a little bit of standardization. This was very interesting to me. I taught for forty years at New York University and other places and what I became convinced of as I learned something about the appearance of standard language at the beginning of the Renaissance in France and in Spain and other places was that the standardization of the written languages always came 100-200 years before any motion towards standardizing spoken languages.

John Fisher, Medievalist, Retired Professor Emeritus of English at University of Tennessee, Author of The Emergence of Standard English. Source: COTC Interview - http://www.childrenofthecode.org/interviews/fisher.htm#Emergence

Beginnings of Standard English - 1400

Having decided that standardization must come from the written form of language, I began to look much more closely at my manuscripts and my people in the 14th century. It suddenly came through to me, something that no history of the language had pointed out before: that the standardization of English, and manuscripts written in English, which grew rapidly more and more standard, came at the time of the Lancastrian Rebellion in England. We had dialects up until 1400. Right after 1400 we have the beginning of standardization, and this is the time that Henry IV and then Henry V took over power in England. If you read the history of the time, what you find out is that these people wanted to gain the support of the common people in England, because all histories of the English Parliament say that Parliament begins about 1400. Up until that time it isn't important. From 1400 on it becomes more and more important. 

If the rulers, the two Henrys, Henry IV and Henry V, wanted to get support from the common people they would have to change the language of Parliament from French to English.  

John Fisher, Medievalist, Retired Professor Emeritus of English at University of Tennessee, Author of The Emergence of Standard English. Source: COTC Interview - http://www.childrenofthecode.org/interviews/fisher.htm#beginnings

King Henry V - The Originator of Written English

John Fisher:  Another person that you might want to talk to about it is a person who wrote his dissertation with me and who found out that Henry V had a direct connection with it. His name is Malcolm Richardson and I believe he is now at Louisiana State University. He is the man who found the documents that Henry V himself had written, and I would maintain that since all these scribes worked for Henry V, Henry V owned written language. If we had to point to one person whose language was the basis for modern standard English, it would be Henry V.

David Boulton:  Did he read and write directly personally or did he have scribal interface?

John Fisher:  Oh no, Henry V could read and write himself. He read and wrote in English.

David Boulton:  So he would have been a major influencing factor over what was considered to be the right way to do this.

John Fisher:  I would say so.        

John Fisher, Medievalist, Retired Professor Emeritus of English at University of Tennessee; Author, The Emergence of Standard English. Source: COTC Interview - http://www.childrenofthecode.org/interviews/fisher.htm#henryv

French and Latin

John Fisher:  English continued to be the spoken language but everything written until 1400 was in French and in Latin. So you can see how it went from there. These two people trying to appeal to the common people began to use English for law, and for business, and for literature. They began to standardize it, to regularize it as a way of getting support from the common people who had been using French and Latin for these purposes up until that time.

David Boulton:  The common people had been using French and Latin or the intermediate middle class?

John Fisher:  Well, by common people I don't mean peasants. I mean that all of the functions in England having to do with trade, having to do with law, having to do with literature, were all in French and Latin. The only things we have in English up until 1400 are patois (a clumsy peasant language without a standardized writing system); you have myths and things like that that people took down from the spoken language. Any kind of business in London until 1400 had to be carried on in French or in Latin. There wasn't a vocabulary to carry on (business) in English. Consequently, when they began to standardize the English language, fifty percent of the vocabulary came from French and Latin. All of the standard words had to be adopted from French and Latin, where they had vocabulary for it, because they didn't have vocabulary for it in English.

David Boulton:  So the French and Latin languages were making distinctions that had become important that the English language hadn’t.

John Fisher:  The Romans had all of these things worked out 2000 years ago. They simply took the terminology from Latin which the church had brought into its business dealings and they simply used this and French. Until 1300 or 1400, people in France thought that Latin was simply the correct way to write French. They thought that French was derived from Latin. And so they wanted to use Latin as the language of record but they talked French. It wasn’t until well after 1300 that they began to write important documents in France in French. And the same thing for Spain.   

John Fisher, Medievalist, Retired Professor Emeritus of English at University of Tennessee, Author of The Emergence of Standard English. Source: COTC Interview - http://www.childrenofthecode.org/interviews/fisher.htm#frenchlatin

English Sounds Transcribed into Latin

John Fisher:  They all had to learn Latin in school because Latin was the living language. And they knew what the sounds of the Latin letters were. And so when they found the same sounds in the vernacular languages, English or French or Spanish, ‘ah’ was ‘ah’ in Latin and ‘ah’ became ‘A’ (‘ah’) in English. And what they did in a sense was transcribe the sounds of English and Spanish into the sounds that they learned in school for Latin.

David Boulton:  Yes, excellent. Now at this point in time, as I understand it, there was a much closer correspondence between letters and sounds within the Latin system.

John Fisher:  Well, they didn't pronounce Latin the way we do. That's all been brought back in the 19th century to the sounds that we think Romans used. But they knew what their pronunciation of the letters was in Latin, and so they simply made the same sound in French or in English with the letter A or the letter B or C. It's what the missionaries do when they go to the Polynesian Islands. When they want to translate the Bible into Polynesian they don't invent a new alphabet. They simply make the Latin alphabet that we're using express the sounds they think they're hearing in Polynesian languages. We're doing this all the time. The Bible has had hundreds of translations using Latin characters into African dialects, Pacific dialects, Asian dialects. They simply use the Latin alphabet but make the sounds the sounds of Swahili or something like that.  

John Fisher, Medievalist, Retired Professor Emeritus of English at University of Tennessee; Author, The Emergence of Standard English. Source: COTC Interview - http://www.childrenofthecode.org/interviews/fisher.htm#transcribed

Not Enough Latin Letters for English

David Boulton:  Up to this point when it starts to get standardized, and regardless of the fact that the sounds that they may have been using at that time are different than the way we might interpret them now, it seems that Latin had a tighter correspondence in terms of the number of discrete and distinct sounds in the spoken language and how they mapped to the letters in the alphabet.

John Fisher:  Oh yes. You see, the alphabet was made for Latin. It has never fit very well for English. In order to get the English sound of 'huh' we don't have a letter for 'huh'. The Romans didn't have a 'huh' sound and they didn't have a letter for 'huh'. So we had to create the combination GH to make 'huh'. We make it through now. But we had to have SH, GH, and other combinations which they didn't have to have in Latin, because the Latin alphabet was conceived, created for the sounds in Latin. And we have sounds in English that they don't have in Latin.       

John Fisher, Medievalist, Retired Professor Emeritus of English at University of Tennessee; Author, The Emergence of Standard English. Source: COTC Interview - http://www.childrenofthecode.org/interviews/fisher.htm#notenough

Alphabet Wasn't Created for English

David Boulton:  But today the struggle that our children face in learning to read is that there’s a lot of, at least as perceived by them, confusion and ambiguity in the relationship between letters and sounds.

John Fisher:  Because the alphabet wasn’t created for English.

David Boulton:  Right. And these scribes that were mapping the alphabet to the English spoken language weren’t very concerned about the ecology of processing the resulting code for the masses of humanity that would come later.

John Fisher:  Well see, everybody who knew how to read and write until about 1600 learned to read and write in Latin and then tried to apply that to English. Nobody learned to read and write in English until well after the Renaissance.

John Fisher, Medievalist, Retired Professor Emeritus of English at University of Tennessee; Author, The Emergence of Standard English. Source: COTC Interview - http://www.childrenofthecode.org/interviews/fisher.htm#alphabetwasnt

Nobody Cared About English

John Fisher:  Well, you see, until 1700 nobody cared about English. It was a patois (a clumsy peasant language without a writing system) and nobody cared about how these sounds were produced. It isn't until you get Dr. Johnson, Priestley and the people who wrote the first grammars and dictionaries after 1700 that people began to think about the difficulties of writing English. Until that time it just grew like Topsy. But it was standardized because all of the people in government, all of the people in law, wanted to communicate. If they wrote the law in London they wanted people in Bristol to be able to understand what it was.

David Boulton:  So, it was the same logic as King James wanting to have a Bible that the people could read.

John Fisher:  That’s right. Exactly.

David Boulton:  So, in order to unite the people, in order to create a power base, in order to have what we might call a distributed conversation that's possible in a larger constituency, you've got to start with a common frame of reference, which is the written language.  

John Fisher:  The people who created that King James language were not linguists. They didn’t think about sounds, they didn’t think about inflectional endings, things like that. They simply wanted a practical, useful means to communicate.       

John Fisher, Medievalist, Retired Professor Emeritus of English at University of Tennessee; Author, The Emergence of Standard English. Source: COTC Interview - http://www.childrenofthecode.org/interviews/fisher.htm#NobodyCaredAboutEnglish

Chancery Scribes

John Fisher:  Well, I think it might be useful for people to realize that the standardization that began under King Henry V was a purely practical operation. He had twenty-six scribes who wrote the royal documents. And in the Chancery at that time there were about 220 scribes who did practically all the important writing in England. I have some figures on this in the book and you might want to look at it sometime, but it was about 250 people under Henry V who began to use English for legal purposes, for philosophical purposes, for educational purposes.

It was a very, very small group of people and they all worked in the same offices and they all had the same kind of training and so they created a language which if you knew the scribal language of England you could use anywhere. We can look at the manuscripts in 1440-1450 and we can tell immediately if this person had been trained in the scribal school. They had their houses, scribal houses in London. They lived together. If he had been there for a couple of years you could read his writing. If he hadn’t been there for a couple of years it’s very hard to figure out what he’s saying.

David Boulton:  So they wrote for a very small elite.

John Fisher:  Oh yes. The group who could read and write were perhaps 5000 people in all of England. This is why we talk about the ‘ars dictaminis’ (art of dictation).  Most of the writing that was done until 1500 was done by dictation. There were scribes. You called the scribe in and told them what to say and he wrote it down for you. Then it went to the person who was supposed to receive it. That person called his scribe in, the scribe read it to him and so it was from the dictator to the auditor. We still use the terms dictator and auditor.

David Boulton:  Yes. It's an earlier technology that actually carries over to the way we encode and decode technological messages today.

John Fisher:  Exactly.

David Boulton:  The people inside these scribal schools at the time this was happening, the people doing the scribal work on both sides of this, were people who had come up through Latin.

John Fisher:  They would all have learned to read in Latin. Nobody went to school to learn English.  

John Fisher, Medievalist, Retired Professor Emeritus of English at University of Tennessee; Author, The Emergence of Standard English. Source: COTC Interview - http://www.childrenofthecode.org/interviews/fisher.htm#chancery

A Sequence of Accidents 

Dr. Thomas Cable: So, there's a sequence of accidents beginning with the dialect that became Standard English, which was essentially London English. It was reinforced by Cambridge and Oxford, maybe it was reinforced by Chaucer, but we know that poets don't count for so much now; they probably didn't count for much more then as far as standardizing the language.

David Boulton: They created a market.

Dr. Thomas Cable: Yes, that was an accident, that it was London, the capital of the country. The same thing happened in France. Paris was — that gave the French their dialect. Now, Italy had more of a problem, and so it took them a while to decide which dialect would be standard. But here in England and France you had London and Paris. So that was nothing that was God-given, and it was certainly nothing that a committee of cognitive scientists or reading experts came up with.

Once that dialect was established, the way it was spelled, as we just said, was fortuitous, almost whimsical.  

Thomas Cable, Professor of English, University of Texas-Austin; Author,  A History of the English Language. Source: COTC Interview - http://www.childrenofthecode.org/interviews/cable.htm#ASequenceofAccidents


Printers and Scribes

Caxton was a bookseller, and he was very much concerned about reaching the widest market, and so he thought about it. But he didn't think about it in the terms that we're talking about it in.

David Boulton:  Then, of course, when it originally formed, it sounds to me like for the scribes, and certainly up to that point, the language of power, as you mentioned, and the language of science and learning was Latin. So if we're going to err in the way of stewarding or being careful with one of the languages, it wasn't going to be English.

Dr. Thomas Cable: Right.

David Boulton: English was for the other people.

Dr. Thomas Cable: Exactly. It wasn't taken seriously for literature or for science.

David Boulton: Right, it was just another thing to be able to connect up this other part of the population to build some constituency over time. But it wasn't the central compass for what was going on.

Dr. Thomas Cable: Right.

David Boulton: My sense is that it was: first preserve the Latin; second, the French. All of these influences were foremost in the minds of the people that were originally involved in developing what would become our written language.

Dr. Thomas Cable: Yes, certainly in some of them, there were divisions, there were debates about this. So you get lines of debate, going back to, as I said, the latter part of the 16th century. Some people were — first of all, they had to defend English against those who wanted to keep Latin as the language of learning.

Now, Richard Mulcaster is an admirable person for his time, because not only did he defend English, but he tried to figure out some way to standardize the spelling. He was always moderate. But you never know how much effect a single person has in matters of language; certainly then, when there were fewer people speaking English, a single person could have more of an effect.

But sometimes the language just takes on a life of its own — I mean, usually the language takes on a life of its own. You were concerned that nobody was minding the store, or you are concerned, and rightly so, but I'm not sure if it would have made any difference.  

Thomas Cable, Professor of English, University of Texas-Austin; Author, A History of the English Language. Source: COTC Interview - http://www.childrenofthecode.org/interviews/cable.htm#PrintersandScribes

No One Was Minding the Store

David Boulton: And yet when we look at it, whether we go back to Henry V's letter to his wife, or the French interruption, or all these other little points that we've been touching on, what's really clear is that people just did what made sense to them at the time. We did not have cognitive scientists, psychologists and code theorists and...

Dr. Thomas Cable: I think that's an important point to make. In various ways, that's the point I make when I teach the history of the English language: what has become international standard written English is such that you can read a newspaper from Sydney and a newspaper from London, and a newspaper from New Delhi, and from Boston, and you can go for paragraphs without really knowing what country the paper was published in.

That kind of international English came about as a sequence of accidents. It really goes back to the variety — the first accident was the variety of English that became standard. So you can trace it certainly from the 14th century when London was the most important city.

If Lancaster had been as important as London, we'd have quite a different English now. But the East Midland dialect was the one that was the most populous. It was a kind of compromise between the extreme dialects of the north and the south. But there wasn't any committee that got together and said, "This is the way English ought to be."

David Boulton: No one was minding the store...

Dr. Thomas Cable: No one.

David Boulton: Nobody saying, "We're going to take responsibility for...”  

Dr. Thomas Cable: No minding the store at any point, even though, as we were just talking earlier, they tried to put some store minders in with an academy. But that didn't work. All of the self-appointed guardians I think had very little influence.

The standardization of spelling is often attributed to Dr. Johnson in the 1755 dictionary. But really, spelling had pretty much been standardized by 1650. That was, again, not because of what the orthoepists and the reformers were doing in the late 16th century, it was just the printing press.  

Thomas Cable, Professor of English, University of Texas-Austin; Author, A History of the English Language. Source: COTC Interview - http://www.childrenofthecode.org/interviews/cable.htm#NoOneWasMindingtheStore

Language Protectionism

Dr. Thomas Cable: There was a lot of debate about what words should be let into the language. Have you encountered that?

David Boulton: No.

Dr. Thomas Cable: This was the same time as the spelling reformers, and there were heated debates. If you stop and think about it, there's a kind of ripple effect all the way up to William Safire and John Simon and others. They were concerned that too many Latin words were coming into the language. This was essentially the 1580s, and they were also concerned about the Italianate Englishman, the guy who would go to Italy and come back affecting — you see this in Shakespeare's plays — affecting Italian styles and using a continental language, either Italian or French, showing his savoir faire, if you like. And there was a reaction against this.

So there were several specific categories of vocabulary that these guys objected to: inkhorn terms, which were those from Greek and Latin; overseas language, which was from Italian and French, and to some extent Spanish; and Chaucerisms, or the old words that were being revived. So during the Renaissance, there was this anxiety about letting words into the language. But the words are going to come in willy-nilly. Even the French have found that out with their French academy.

Still into the 20th and 21st century, you hear people worrying about George Orwell's famous essay on Politics and the English Language about Latinate words. I don't mean to be condescending to that, because there is a point that if you use polysyllabic Latinate vocabulary, you can just sound very pretentious, and it turns into jargon. So there's a kind of middle path somewhere to hit. But this was a problem that went back to the 16th century — I'm talking too much, sorry.

David Boulton: You're doing great. Please continue, I'm really enjoying listening to you.  

Dr. Thomas Cable: But we can always see the parallels between the printing press then and the computer and the internet now, the anxiety over Latinate terminology then and the legitimate concern about jargon. George Orwell said, "Use these homely, simple, monosyllabic native words, not the Latin vocabulary."

David Boulton: As you were saying that, the image that struck me was that it was very much like the problems of immigration or the problems of the melting pot of culture in general.

Dr. Thomas Cable: Yes, exactly.

David Boulton: Some people's concern today about the loss of jobs that goes with spreading the economy around; it's like this nationalistic, conservative protectionism, but happening at a language level.

Dr. Thomas Cable: Exactly.

David Boulton: The center of our memetics.

Dr. Thomas Cable: Exactly. That's true, yes.

David Boulton: That's fantastic. I really hadn't gotten on that dimension of conserving or insulating against the inbound influence of other words.

Dr. Thomas Cable: Let me find something and read it to you, if I could.

David Boulton: Please.

Dr. Thomas Cable: This is a letter from Sir John Cheke in 1561. He was a purist and he said — and this was published in Sir Thomas Hoby's translation of the Courtier, 1561: "I am of this opinion, that our own tongue should be written clean and pure, unmixt and unmangled with borrowing of other tongues. Wherein, if we take not heed by time, ever borrowing and never paying" — see, it's this metaphor that we've got of borrowing words, as if we're running up this debt — "she shall be fain to keep her house as bankrupt; for then doth our tongue naturally and praisably utter her meaning when she borrowith no counterfeitness of other tongues to attire herself withal, but usith plainly her own with such shift as nature, craft, experience, and following of other excellent, doth lead her unto. And if she want at any time…" — that is, being in lack at any time — "as being unperfect she must, yea, let her borrow with such bashfulness that it may appear that if either the mold of our own tongue should serve us to fashion a word of our own, or if the old denisoned words could content and ease this need, we would not boldly venture of unknown words." Now, that's wonderful, isn't it?  

Thomas Cable, Professor of English, University of Texas-Austin; Author, A History of the English Language. Source: COTC Interview - http://www.childrenofthecode.org/interviews/cable.htm#LanguageProtectionism

The Power of English

Dr. Thomas Cable: You said that the reason you think English has become so powerful is because it allows ambiguity. The reason English has become as powerful as it is, is because of Donald Rumsfeld and his predecessors.  

David Boulton: The language imperialists?

Dr. Thomas Cable: Right. We rule the world and before us the Brits did, so that's why English is as powerful as it is. Now, to be more nuanced about it, I think that ambiguity does allow English to do some things that French, for example, doesn't. I'm thinking now about, oh, the receptivity to new words, which the French are always on their guard about. So if you look at a bilingual English and French dictionary, if you have them in two volumes, the English volume will be half again as large as the French volume. Now, that's good. In the colonies, new words came in in a way that they were not allowed in French. So I'm sort of contradicting myself. I'm no expert, certainly, on this. But it does seem that the French were successful in repelling these intruders, whereas English was more welcoming.

Certainly at the lexical level, we've got a lot of ambiguity. You were, I'm sure, thinking of other kinds of ambiguity as well. But yes, the ambiguity in spelling probably does allow for more creativity.   

Thomas Cable, Professor of English, University of Texas-Austin; Author, A History of the English Language. Source: COTC Interview - http://www.childrenofthecode.org/interviews/cable.htm#ThePowerofEnglish




Note about interviews: Participation in a Children of the Code interview does not constitute or imply an endorsement of the Children of the Code project or documentary by the interviewee. Conversely, including an interview does not constitute or imply an endorsement of the views, organizations, books or products of the interviewee, other than as explicitly stated, by the Children of the Code Project and documentary.  
 




Dr. Grover (Russ) Whitehurst  Director, Institute of Education Sciences, Assistant Secretary of Education, U.S. Department of Education
Dr. Jack Shonkoff Chair, The National Scientific Council on the Developing Child; Co-Editor: From Neurons to Neighborhoods
Dr. Edward Kame'enui Commissioner for Special Education Research, U.S. Department of Education; Director, IDEA, University of Oregon
Dr. G. Reid Lyon  Past Director, National Institute of Child Health and Human Development (NICHD) at the National Institutes of Health (NIH)
Dr. Keith Stanovich  Canadian Chair of Cognitive Science, University of Toronto
Dr. Mel Levine Co-Chair and Co-Founder, All Kinds of Minds; Author: A Mind at a Time, The Myth of Laziness & Ready or Not Here Life Comes
Dr. Alex Granzin  School District Psychologist, Past President, Oregon School Psychologists Association 
Dr. James J. Heckman Nobel Laureate, Economic Sciences 2000; Lead Author: The Productivity Argument for Investing in Young Children
Dr. Timothy Shanahan President (2006) International Reading Association, Chair National Early Literacy Panel, Member National Reading Panel
Nancy Hennessy  President, 2003-2005, International Dyslexia Association
Dr. Marilyn Jager Adams Senior Scientist, Soliloquy Learning, Author: Beginning to Read: Thinking and Learning About Print
Dr. Michael Merzenich Chair of Otolaryngology, Integrative Neurosciences, UCSF;  Member National Academy of Sciences
Dr. Maryanne Wolf Director, Center for Reading & Language Research; Professor of Child Development, Tufts University
Dr. Todd Risley  Emeritus Professor of Psychology, University of Alaska, Co-author: Meaningful Differences
Dr. Sally Shaywitz  Neuroscientist, Department of Pediatrics, Yale University, Author: Overcoming Dyslexia
Dr. Louisa Moats  Director, Professional Development and Research Initiatives, Sopris West Educational Services
Dr. Zvia Breznitz Professor, Neuropsychology of Reading & Dyslexia, University of Haifa, Israel 
Rick Lavoie Learning Disabilities Specialist, Creator: How Difficult Can This Be?: The F.A.T. City Workshop & Last One Picked, First One Picked On
Dr. Charles Perfetti Professor, Psychology & Linguistics; Senior Scientist and Associate Director, Learning R&D Center, U. of Pittsburgh, PA
Arthur J. Rolnick Senior V.P. & Dir. of Research, Federal Reserve Bank of Minneapolis; Co-Author: The Economics of Early Childhood Development
Dr. Richard Venezky  Professor, Educational Studies, Computer and  Information Sciences, and Linguistics, University of Delaware
Dr. Keith Rayner Distinguished Professor, University of Massachusetts, Author: Eye Movements in Reading and Information Processing
Dr. Paula Tallal  Professor of Neuroscience, Co-Director of the Center for Molecular and Behavioral Neuroscience, Rutgers University
Dr. John Searle Mills Professor of the Philosophy of Mind and Language, University of California-Berkeley, Author: Mind, A Brief Introduction
Dr. Mark T. Greenberg Director, Prevention Research Center, Penn State Dept. of Human Development & Family Studies; CASEL Leadership Team
Dr. Terrence Deacon  Professor of Biological Anthropology and Linguistics at University of California- Berkeley
Chris Doherty  Ex-Program Director, National Reading First Program, U.S. Department of Education
Dr. Erik Hanushek Senior Fellow, Hoover Institution, Stanford University

Dr. Marketa Caravolas Director, Bangor Dyslexia Unit, Bangor University, Author: International Report on Literacy Research
Dr. Christof Koch Professor of Computation and Neural Systems,  Caltech - Author: The Quest for Consciousness: A Neurobiological Approach
Dr. Guy Deutscher Professor of Languages and Cultures of Ancient Mesopotamia, Holland; Author: Unfolding Language
Robert Wedgeworth  President, ProLiteracy, World's Largest Literacy Organization
Dr. Peter Leone  Director, National Center on Education, Disability and Juvenile Justice
Dr. Thomas Cable  Professor of English, University of Texas at Austin, Co-author: A History of the English Language
Dr. David Abram Cultural Ecologist and Philosopher; Author: The Spell of the Sensuous
Pat Lindamood and Nanci Bell  Principal Scientists, Founders, Lindamood-Bell Learning Processes
Dr. Anne Cunningham  Director, Joint Doctoral Program in Special Education, Graduate School of Education at University of California-Berkeley
Dr. Donald L. Nathanson  Clinical Professor of Psychiatry and Human Behavior at Jefferson Medical College, Director of the Silvan S. Tomkins Institute 
Dr. Johanna Drucker Chair of Media Studies, University of Virginia, Author: The Alphabetic Labyrinth
John H. Fisher  Medievalist, Leading authority on the development of the written English language, Author: The Emergence of Standard English
Dr. Malcolm Richardson   Chair, Dept. of English, Louisiana State University; Research: The Textual Awakening of the English Middle Classes  
James Wendorf  Executive Director, National Center for Learning Disabilities
Leonard Shlain Physician; Best-Selling Author: The Alphabet vs. The Goddess
Robert Sweet  Co-Founder, National Right to Read Foundation

FULL LIST OF OVER 100 COMPLETED INTERVIEWS



The Children of the Code is a Social Education Project and a Public Television Series intended to catalyze and resource a social-educational transformation in how we think about and, ultimately, teach reading. It is an entertaining educational journey into the challenges our children's brains face when learning to read. The series weaves together archeology, history, linguistics, developmental neuroscience, cognitive science, psychology, information theory, reading theory, learning theory, and the personal and social dimensions of illiteracy. 








 

Copyright statement:  Copyright (c) 2014, Learning Stewards, A 501(c)(3) Non-Profit Organization, All Rights Reserved. Permission to use, copy, and distribute these materials for not-for-profit educational purposes, without fee and without a signed licensing agreement, is hereby granted, provided that "Children of the Code - www.childrenofthecode.org"  (with a functioning hyperlink when online) be cited as the source and appear in all excerpts, copies, and distributions.  Thank you. (back to top)