John H. Fisher - The Origins of the English Writing System and the Roots of its Letter-Sound Confusions

Index:

The Emergence of Standard English
Writing Standards Create the Standards for Oral Language
Vocabulary Develops Through Writing
1400 The Beginnings of Standard English
French and Latin
English Sounds Transcribed into Latin
Not Enough Latin Letters for English
Alphabet Wasn’t Created for English
The International Phonetic Alphabet
Reading: Phonics & Whole Language
Nobody Cared About English
The Chancery Scribes
King Henry V – The Originator of Written English
Gratitude for Dr. Richard Venezky

John H. Fisher is a Medievalist, author, and authority on the standardization of the English language. He has written many books, including The Emergence of Standard English, The Importance of Chaucer, The Complete Poetry and Prose of Geoffrey Chaucer, and several others. He is professor emeritus of English at the University of Tennessee, from which he retired in 1996.

During our interview with Dr. Richard Venezky he mentioned that the standardization of written English began in the 1400s. In an email exchange following the interview, Dr. Venezky indicated that the expert to consult on standardization was John H. Fisher. We interviewed Professor Fisher by phone at his home in Tennessee on January 29, 2004. He was an absolute delight to talk with. We subsequently taped a video interview with him. The following is from our phone interview. We will append it with the transcript of our video interview when it becomes available.




See copyright policy for info on the free use of this content

The following transcript has not been edited for journal or magazine publication (see 'Interview Notes' for more details). Bold is used to emphasize our [Children of the Code] sense of the importance of what is being said and does not necessarily reflect gestures or tones of emphasis that occurred during the interview.

The Emergence of Standard English:

David Boulton:  How did you come to your passion for and interest in understanding the emergence of Standard English?

John H. Fisher:  Well, I am a Medievalist. I teach Old English and Middle English and Chaucer. I did a great deal of work in the Middle English period with the manuscripts. And what was very interesting was to see that after the Norman Conquest English became nothing but a ‘patois’ (a peasant language without a writing system). Old English was pretty well standardized, but after 1066 all standardization disappeared and what you had was dialects. English was standardized by people who were trained to read and write in Latin. When they began to write down English they wrote it down with the equivalents of the Latin alphabet; the sounds of A and B and C as in Latin, but they applied that to the pronunciation of their own language.

So, there was no English standard language from let’s say 1100 to 1400 and by the time you get to Chaucer you begin to have a little bit of standardization. This was very interesting to me. I taught for forty years at New York University and other places and what I became convinced of as I learned something about the appearance of standard language at the beginning of the Renaissance in France and in Spain and other places was that the standardization of the written languages always came 100-200 years before any motion towards standardizing spoken languages.

Writing Standards Create the Standards for Oral Language:

David Boulton:  So, the written languages came into standardization first and became the frame of reference that would lead to the standardization of oral language.

John Fisher:  Well, I wasn’t looking for any reasons as I was teaching these things. I simply observed.

David Boulton:  This is what emerged from your research.

John Fisher: That’s right. In any of these languages, in Spanish, Italian, French, and English, the written language becomes standardized before the spoken language does, and I wondered why this was the case. The second thing that occurred to me, though this wasn’t systematic, was that written standards can continue to be used all over the country and for many, many years. The whole purpose of a written standard is that if I write it in Tennessee you can read it in Hawaii. If I write it down in 2004 somebody in 2054 will be able to read it. This isn’t true of spoken language. You have the Robinson Crusoe business. Unless we can communicate daily we soon fall out and can no longer communicate with each other.

I came to the conclusion that all of the discussion of standardization of language was a discussion of the written forms of language. It had nothing to do with spoken language. We don’t have the spoken language standardized yet. When we say that we’re speaking Standard English what we’re doing is transferring into our spoken vocabulary and syntax the elements of the written language. What is standard in what you and I are talking now is what we get from our writing. If we didn’t know how to write we would have a much harder time communicating because you’re thousands of miles away, your spoken language is already different from mine and would be much more different from mine unless we went to school and learned the written language. 

Vocabulary Develops Through Writing:

David Boulton:  This corresponds very nicely with work that has come out of reading research that shows very clearly that it is the development of the capacity for reading and writing that really creates our common vocabulary.

John Fisher:  That’s right.

David Boulton:  So the reading research comes right behind and parallels you and intersects right in this space (to Reid Lyon).

1400 The Beginnings of Standard English:

John Fisher:  Having decided that standardization must come from the written form of language I began to look much more closely in my manuscripts and my people in the 14th century. It suddenly came through to me something that no history of the language had pointed out before: that the standardization of English and manuscripts written in English, which grew rapidly more and more standard, came at the time of the Lancastrian Rebellion in England. We had dialects up until 1400. Right after 1400 we have the beginning of standardization and this is the time that Henry IV and then Henry V took over the power in England. If you read the history of the time what you find out is that these people wanted to gain the support of the common people in England because all histories of English Parliament say that Parliament begins about 1400. Up until that time it isn’t important. From 1400 on it becomes more and more important. 

If the rulers, the two Henrys, Henry IV and Henry V, wanted to get support from the common people they would have to change the language of Parliament from French to English.

David Boulton:  So they could communicate with them.

French and Latin:

John Fisher:  English continued to be the spoken language but everything written until 1400 was in French and in Latin. So you can see how it went from there. These two people trying to appeal to the common people began to use English for law, and for business, and for literature. They began to standardize it, to regularize it as a way of getting support from the common people who had been using French and Latin for these purposes up until that time.

David Boulton:  The common people had been using French and Latin or the intermediate middle class?

John Fisher:  Well, by common people I don’t mean peasants. I mean all of the functions in England having to do with trade, having to do with law, having to do with literature, were all in French and Latin. The only things we have in English up until 1400 are patois (a clumsy peasant language without a standardized writing system); you have myths and things like that that people took down from the spoken language. Any kind of business in London until 1400 had to be carried on in French or in Latin. There wasn’t a vocabulary to carry on (business) in English. Consequently, when they began to standardize the English language, fifty percent of the vocabulary came from French and Latin. All of the standard words had to be adopted from French and Latin, where they had vocabulary for it, because they didn’t have vocabulary for it in English.

David Boulton:  So the French and Latin languages were making distinctions that had become important that the English language hadn’t.

John Fisher:  The Romans had all of these things worked out 2000 years ago. They simply took the terminology from Latin which the church had brought into its business dealings and they simply used this and French. Until 1300 or 1400, people in France thought that Latin was simply the correct way to write French. They thought that French was derived from Latin. And so they wanted to use Latin as the language of record but they talked French. It wasn’t until well after 1300 that they began to write important documents in France in French. And the same thing for Spain. 

English Sounds Transcribed into Latin:

David Boulton:  And in both cases they also had scribes of sorts that were applying the sound system and the meaning system of Latin.

John Fisher:  They all had to learn Latin in school because Latin was the living language. And they knew what the sounds of the Latin letters were. And so when they found the same sounds in the vernacular languages, English or French or Spanish, ‘ah’ was ‘ah’ in Latin and ‘ah’ became ‘A’ (‘ah’) in English. And what they did in a sense was transcribe the sounds of English and Spanish into the sounds that they learned in school for Latin.

David Boulton:  Yes, excellent. Now at this point in time, as I understand it, there was a much closer approximation in the correspondence between letters and sounds within the Latin system.

John Fisher:  Well, they didn’t pronounce Latin the way we do. That’s all been brought back in the 19th century to the sounds that we think Romans used. But they knew what their pronunciation of the letters was in Latin and so they simply made the same sound in French or in English with the letter A or the letter B or C. It’s what the missionaries do when they go to the Polynesian Islands. When they want to translate the Bible into Polynesian they don’t invent a new vocabulary. They simply make the Latin vocabulary that we’re using express the sounds they think they’re hearing in Polynesian languages. We’re doing this all the time. The Bible had hundreds of translations using Latin characters into African dialects, Pacific dialects, Asian dialects. They simply use the Latin alphabet but make the sounds the sounds of Swahili or something like that.

David Boulton:  Right, by coming up with some close mapping between what they hear.

John Fisher:  Right.

Not Enough Latin Letters for English:

David Boulton:  Up to this point when it starts to get standardized and regardless of the fact that the sounds that they may have been using at that time are different than the way we might interpret them now, it seems that the Latin had a tighter correspondence in terms of the number of discrete and distinct sounds in the spoken language and how they mapped to the letters in the alphabet.

John Fisher:  Oh, yes. You see the alphabet was made for Latin. It has never fit very well for English. In order to get the English sounds of ‘huh’ we don’t have a letter for ‘huh’. The Romans didn’t have a ‘huh’ sound and they didn’t have a letter for ‘huh’. So we have to create the combination GH to make ‘huh’. We make it through now. But we had to have SH, GH, and other combinations which they didn’t have to have in Latin because the Latin alphabet was conceived, created for the sounds in Latin. And we have sounds in English that they don’t have in Latin.

David Boulton:  One of my favorite quotes is from Plato: ‘when we learned the letters of the alphabet we could read.’

John Fisher:  Yes.

David Boulton:  You know, as if reading was simply a case of code cued speech. The letter popped the sound into their minds and they ran them together and that was that.

John Fisher:  Yes.

Alphabet Wasn’t Created for English:

David Boulton:  But today the struggle that our children face in learning to read is that there’s a lot of, at least as perceived by them, confusion and ambiguity in the relationship between letters and sounds.

John Fisher:  Because the alphabet wasn’t created for English.

David Boulton:  Right. And these scribes that were mapping the alphabet to the English spoken language weren’t very concerned about the ecology of processing the resulting code for the masses of humanity that would come later.

John Fisher:  Well see, everybody who knew how to read and write until about 1600 learned to read and write in Latin and then tried to apply that to English. Nobody learned to read and write in English until well after the Renaissance.

David Boulton:  Until after King James.

John Fisher:  Yes.

David Boulton:  So that’s one of the points that we’re most interested in in our series. If we step back from the history for a moment and look at it in terms of code processing, the challenge that children are faced with today is a challenge that the human brain never really experienced before: to read these code elements and, rather than being able to immediately express them as sounds, having to, in computer terms, buffer them. Instead of reading being something that can be totally produced in response to the code, the code now has to be interpreted by processing what the reader has already comprehended, as well as the spelling and phonics rules that evolved to try to correct for this confusion.

John Fisher:  Yes.

David Boulton:  And that is an incredibly complex technological brain process that four and five year old children are struggling to learn and it’s all but fating their lives.

 


The International Phonetic Alphabet:

John H. Fisher:  Well, you see, if we created a new alphabet like the IPA, the International Phonetic Alphabet, it would work much better for writing English than the Latin alphabet does, but we’d have another problem. The sounds that were made in Atlanta, Georgia would be very different from the sounds made in Boston. And so you could have an alphabet that would do the Atlanta sounds very nicely, but if you went up and tried to use it in Boston it wouldn’t work very well.

David Boulton:  Yes. It might work better than what we’ve got but it still would bump into problems because of the sound differences.

John Fisher:  That’s right.

David Boulton:  We have researched Benjamin Franklin’s work, Noah Webster’s work, Melvil Dewey, the whole simplified spelling movement that drew in Darwin and Twain and Bernard Shaw, and the work by Alexander Bell’s father. So many people have tried to address this problem and always hit their heads against the wall when it came to the institution saying, ‘look, we’re not going to change our libraries because some kids are having trouble reading.’

John Fisher:  Yes.

Reading: Phonics & Whole Language:

David Boulton:  So we’re not advocating changing the alphabet or changing the spelling, but what we are trying to do is put the challenge that these children are experiencing in perspective. That it is a technological code processing challenge unlike anything ever experienced before because it has to happen so much faster than kids can think about it.

John Fisher:  For a long time (I’m out of the business of elementary education now) educators felt that it didn’t help the children at all to know the sounds of the alphabet, that they should look at the word ‘the’ and know it was ‘the’. They’d look at the word ‘me’ and know it was ‘me’. They wouldn’t try to sound it out.

Until 1900, through the 17th, 18th, and 19th centuries, people thought that the way to learn to read was to read aloud. But most of the German philologists said reading aloud prevents you from reading easily because the alphabet doesn’t match the sounds. Let’s just recognize what the word is and what the sound of the word is without worrying what letters went into it. I think they’ve come back and they’re not doing just the look-see now, but for a long time in there, in the 1920’s and 30’s and 40’s, they didn’t want people to be able to sound out the letters.

David Boulton:  Right and my understanding is that phonics emerged in the mid-1600’s as a way to try to teach around the confusions in the code.

John Fisher:  Right.

David Boulton:  And then in the late 1700’s the Germans you’re speaking of inspired Horace Mann and his group here in America to move towards what would become the whole word method. And these two have been back and forth for a hundred and something years, most recently coming to realize that while whole language represents an easier takeoff, it does not exercise the mind into making the kind of structural unit distinctions that are necessary to be able to go on and decode and read words that you weren’t exposed to and trained to have an association with.

John Fisher:  And you know this association works very well for a simple statement. But once you get into the kind of difficult words that we use in science or chemistry or things like that you have to be able to sound them out. You can’t just recognize ‘oxygen’. 

Nobody Cared About English:

David Boulton:  Exactly, I so appreciate the opportunity to talk with you and to hear the wisdom that you’ve gained through your work on all of this. Let’s go back in time to the period of the Chancery scribes and in particular, one of the points that we want to get at is to what extent was anybody really minding the store?

We have these two language systems, this common people’s spoken English, like you were saying, and then we have the Latin and French influences on the beginnings of the writing system. And then we have people who are oriented to Latin and/or oriented to the French language starting to transcribe English into written form and bringing all of the biases of those two language systems to the process. And when the Latin writing system hits the English oral language we’re talking about having only twenty-four letters to represent forty-something sounds. There weren’t any cognitive scientists or neuroscientists or psychologists or child development experts; there wasn’t any concern for the hundreds of millions of people that would come later. There were a small number of people bringing this standardization in. I want to understand that.

John Fisher:  Well, you see, until 1700 nobody cared about English. It was a patois (a clumsy peasant language without a writing system) and nobody cared about how these sounds were produced. It isn’t until you get Dr. Johnson, Priestley, and the people who wrote the first grammars and dictionaries after 1700 that people began to think about the difficulties of writing English. Until that time it just grew like Topsy. But it was standardized because all of the people in government, all of the people in law wanted to communicate. If they wrote the law in London they wanted people in Bristol to be able to understand what it was.

David Boulton:  So, it was the same logic as King James wanting to have a Bible that the people could read.

John Fisher:  That’s right. Exactly.

David Boulton:  So, in order to unite the people, in order to create a power base, in order to have what we might call a distributed conversation that’s possible in a larger constituency, you’ve got to start with a common frame of reference, which is the written language.

John Fisher:  The people who created that King James language were not linguists. They didn’t think about sounds, they didn’t think about inflectional endings, things like that. They simply wanted a practical, useful, means to communicate.

David Boulton:  And the point is that, all these years later, so much of the struggle can be traced back to how these languages fused together.

John Fisher:  Well, we still spell through T-H-R-O-U-G-H. We haven’t pronounced it that way since Chaucer’s time. But we still use the spelling.

David Boulton:  So, is there anything else that we could say that would provoke or help people get more of an insight into how it is that we evolved this mapping of twenty-four letters into forty-four sounds; the process of that and what was the role of the scribes in standardizing the relationship between letters and sounds?

The Chancery Scribes:

John Fisher:  Well, I think it might be useful for people to realize that the standardization that began under King Henry V was a purely practical operation. He had twenty-six scribes who wrote the royal documents. And in the Chancery at that time there were about 220 scribes who did practically all the important writing in England. I have some figures on this in the book and you might want to look at it sometime, but it was about 250 people under Henry V who began to use English for legal purposes, for philosophical purposes, for educational purposes.

It was a very, very small group of people and they all worked in the same offices and they all had the same kind of training and so they created a language which if you knew the scribal language of England you could use anywhere. We can look at the manuscripts in 1440-1450 and we can tell immediately if this person had been trained in the scribal school. They had their houses, scribal houses in London. They lived together. If he had been there for a couple of years you could read his writing. If he hadn’t been there for a couple of years it’s very hard to figure out what he’s saying.

David Boulton:  So they wrote for a very small elite.

John Fisher:  Oh yes. The group who could read and write were perhaps 5000 people in all of England. This is why we talk about the ‘ars dictaminis’ (art of dictation).  Most of the writing that was done until 1500 was done by dictation. There were scribes. You called the scribe in and told them what to say and he wrote it down for you. Then it went to the person who was supposed to receive it. That person called his scribe in, the scribe read it to him and so it was from the dictator to the auditor. We still use the terms dictator and auditor.

David Boulton:  Yes. It’s a former technology that actually models over to the way we encode and decode technological messages today.

John Fisher:  Exactly.

David Boulton:  The people that came in this scribal school at the time that this was happening, the people that would have been inside these scribal schools and doing the scribal work on both sides of this were people that had come up through Latin.

John Fisher:  They would all have learned to read in Latin. Nobody went to school to learn English.

David Boulton:  So the people that formulated what would become the written English language had their original training and their mental biases in Latin first and French second. Is that right?

John Fisher:  Yes. And you know, until within the last fifty years, unless you knew Latin you weren’t really educated.

David Boulton:  Right, it was the language of describing distinctions in science.

John Fisher:  Yes.

David Boulton:  I’ve also heard of subsequent issues, like the cost of making type fonts having effects on what characters would be standardized later by the printing press process, on the other side of this major event that we’re talking about now. I’m still trying to get a sense of what kind of awareness or intelligence was brought to bear on developing this mapping between the alphabet and English language sounds. What I’m gathering is that there really wasn’t any great degree of consideration about how to do that. Is there any record of the dialogue going on amongst the scribes about how to represent sounds or how to structure the language?

John Fisher:  There are a few mutterings about it in the 15th and 16th centuries. Do you know Albert Baugh’s A History of the English Language?

David Boulton:  I’m familiar with it and I recognize the name from my scan of your book last night.

John Fisher:  Baugh is very good on what you’re talking about now, the adaptation of the Latin alphabet to English. He has good things to say about it. He was my professor and he is dead, but he did the latest edition with Thomas Cable. Cable is the person who has taken over the Baugh-Cable edition and his name is on the last edition of it.

David Boulton:  Good, I’ll try to get to him. So my question is: was there a consciousness, an explicit conversation, an intentional thinking through of how to do this? I understand one of the distinctions you’re making in your work is that this just didn’t happen randomly, chaotically, naturally, osmotically. There was a group of people, these scribes that we’re speaking of now, and it is their collective, common behaviors that extruded the beginnings of our written language.

John Fisher:  I agree with you on the behavior. Now whether they sat around and had seminars about how to spell ough since it wasn’t in Latin; well, we know how they did it, but whether they had discussions about whether there were better ways of doing or different ways…who knows?

David Boulton:  I’m interested in that because what it suggests is that this small handful of people, the effects that they’ve had on this language is vast.

John Fisher:  Yes.

David Boulton:  And vastly misunderstood in terms of where our language comes from.

John Fisher:  I don’t know that it’s misunderstood, it’s just not understood.

David Boulton:  Not understood. Yes, okay. Better said, I agree.

Is there anything else that’s a story inside this space, inside our conversation about the Chancery scribes; perhaps something that struck you as being really interesting about all of this that you could share?

King Henry V – The Originator of Written English:

John Fisher:  Another person that you might want to talk to about it is someone who wrote his dissertation with me and who found out that Henry V had a direct connection with it. His name is Malcolm Richardson and I believe he is now at Louisiana State University. He is the man who found the documents that Henry V himself had written, and I would maintain that since all these scribes worked for Henry V, Henry V owned the written language. If we had to point to one person whose language was the basis for modern standard English it would be Henry V.

David Boulton:  Did he read and write directly personally or did he have scribal interface?

John Fisher:  Oh no, Henry V could read and write himself. He read and wrote in English.

David Boulton:  So he would have been a major influencing factor over what was considered to be the right way to do this.

John Fisher:  I would say so.

David Boulton:  Well, that’s really an interesting revelation. So we’ve got a particular King of England, two when you consider King James’s influence later, that really had a major role in the English language as we experience it today.

John Fisher:  Yes.

David Boulton:  Fascinating.

John Fisher:  And Richardson would know more about this than anybody else.

David Boulton:  Is there anything else that you’d like to share, something that excited or struck you in the course of your research? You’ve been very helpful.

John Fisher:  I think you have touched on most of the things that most interest me. 

Gratitude for Dr. Richard Venezky:

David Boulton:  Okay. Well sir, I thank you so much for your time and this interview. It’s a great pleasure and honor to speak with you and it was fascinating. I’m so grateful to Dr. Richard Venezky for suggesting I speak with you. Do you know Dr. Richard Venezky very well?

John Fisher:  I know who he is, but you know I’ve been out of the profession for so many years.

David Boulton:  It may tickle you to know that he is considered by everybody in reading science today to be the orthographist. He is the top guy, so to speak. If you go to the President’s reading advisor, Dr. Reid Lyon, or you talk to people in universities they all point to Venezky as being the authority on the structure of English orthography.

John Fisher:  Is he retired too?

David Boulton:  He’s still very much active. When you talk with him as I did about how the English written language came to be stabilized he points to you.

John Fisher:  (laughs) Well, I didn’t do much to stabilize it.

David Boulton:  No you didn’t, but you brought something to awareness that has radiated throughout the entire reading community.

John Fisher:  Well, I’m delighted.

David Boulton:  So I wanted to share that with you.          

John Fisher:  Good.

David Boulton:  Thank you so much for your time. I really appreciate it and I enjoyed our conversation. And if it’s okay with you, I may elect to call again if something else comes up that’s a question that might interest us together.

John Fisher:  I expect I’ve given you most of what I can, but I’ll be delighted to talk to you anytime.

David Boulton:  Okay. Thank you so much sir, and enjoy.

John Fisher:  Surely. 



The Children of the Code is a Social Education Project and a Public Television Series intended to catalyze and resource a social-educational transformation in how we think about and, ultimately, teach reading. The Children of the Code is an entertaining educational journey into the challenges our children's brains face when learning to read. The series weaves together archeology, history, linguistics, developmental neuroscience, cognitive science, psychology, information theory, reading theory, learning theory, and the personal and social dimensions of illiteracy. 
 


 


 

Note about interviews: Participation in the preceding Children of the Code interview does not constitute or imply an endorsement of the Children of the Code project or documentary by the interviewee. Conversely, including an interview does not constitute or imply an endorsement of the views, organizations, books or products of the interviewee, other than as explicitly stated, by the Children of the Code Project and documentary.  



There is no substitute for your first-person learning.


Click to go to the index of Children of the Code video sequences

Dr. Grover (Russ) Whitehurst Director, Institute of Education Sciences, Assistant Secretary of Education, U.S. Department of Education
Dr. Jack Shonkoff Chair, The National Scientific Council on the Developing Child; Co-Editor: From Neurons to Neighborhoods
Siegfried Engelmann Professor of Instructional Research, University of Oregon; Creator of Direct Instruction
Dr. Edward Kame'enui Commissioner for Special Education Research, U.S. Department of Education; Director, IDEA, University of Oregon
Dr. G. Reid Lyon Past Director, National Institute of Child Health and Human Development (NICHD) at the National Institutes of Health (NIH)
Dr. Keith Stanovich Canadian Chair of Cognitive Science, University of Toronto
Dr. Mel Levine Co-Chair and Co-Founder, All Kinds of Minds; Author: A Mind at a Time, The Myth of Laziness & Ready or Not Here Life Comes
Dr. Alex Granzin School District Psychologist, Past President, Oregon School Psychologists Association
Dr. James J. Heckman Nobel Laureate, Economic Sciences 2000; Lead Author: The Productivity Argument for Investing in Young Children
Dr. Timothy Shanahan President (2006) International Reading Association, Chair National Early Literacy Panel, Member National Reading Panel
Nancy Hennessy President, 2003-2005, International Dyslexia Association
Dr. Marilyn Jager Adams Senior Scientist, Soliloquy Learning, Author: Beginning to Read: Thinking and Learning About Print
Dr. Michael Merzenich Chair of Otolaryngology, Integrative Neurosciences, UCSF; Member National Academy of Sciences
Dr. Maryanne Wolf Director, Center for Reading & Language Research; Professor of Child Development, Tufts University
Dr. Todd Risley Emeritus Professor of Psychology, University of Alaska, Co-author: Meaningful Differences
Dr. Sally Shaywitz Neuroscientist, Department of Pediatrics, Yale University, Author: Overcoming Dyslexia
Dr. Louisa Moats Director, Professional Development and Research Initiatives, Sopris West Educational Services
Dr. Zvia Breznitz Professor, Neuropsychology of Reading & Dyslexia, University of Haifa, Israel
Rick Lavoie Learning Disabilities Specialist, Creator: How Difficult Can This Be?: The F.A.T. City Workshop & Last One Picked, First One Picked On
Dr. Charles Perfetti Professor, Psychology & Linguistics; Senior Scientist and Associate Director, Learning R&D Center, U. of Pittsburgh, PA
Arthur J. Rolnick Senior V.P. & Dir. of Research, Federal Reserve Bank of Minneapolis; Co-Author: The Economics of Early Childhood Development
Dr. Richard Venezky Professor, Educational Studies, Computer and Information Sciences, and Linguistics, University of Delaware
Dr. Keith Rayner Distinguished Professor, University of Massachusetts, Author: Eye Movements in Reading and Information Processing
Dr. Paula Tallal Professor of Neuroscience, Co-Director of the Center for Molecular and Behavioral Neuroscience, Rutgers University
Dr. John Searle Mills Professor of the Philosophy of Mind and Language, University of California-Berkeley, Author: Mind, A Brief Introduction
Dr. Mark T. Greenberg Director, Prevention Research Center, Penn State Dept. of Human Development & Family Studies; CASEL Leadership Team
Dr. Terrence Deacon Professor of Biological Anthropology and Linguistics at University of California-Berkeley
Chris Doherty Ex-Program Director, National Reading First Program, U.S. Department of Education
Dr. Erik Hanushek Senior Fellow, Hoover Institution, Stanford University

Dr. Marketa Caravolas Director, Bangor Dyslexia Unit, Bangor University, Author: International Report on Literacy Research
Dr. Christof Koch Professor of Computation and Neural Systems, Caltech - Author: The Quest for Consciousness: A Neurobiological Approach
Dr. Guy Deutscher Professor of Languages and Cultures of Ancient Mesopotamia, Holland; Author: Unfolding Language
Robert Wedgeworth President, ProLiteracy, World's Largest Literacy Organization
Dr. Peter Leone Director, National Center on Education, Disability and Juvenile Justice
Dr. Thomas Cable Professor of English, University of Texas at Austin, Co-author: A History of the English Language
Dr. David Abram Cultural Ecologist and Philosopher; Author: The Spell of the Sensuous
Pat Lindamood and Nanci Bell Principal Scientists, Founders, Lindamood-Bell Learning Processes
Dr. Anne Cunningham Director, Joint Doctoral Program in Special Education, Graduate School of Education at University of California-Berkeley
Dr. Donald L. Nathanson Clinical Professor of Psychiatry and Human Behavior at Jefferson Medical College, Director of the Silvan S. Tomkins Institute
Dr. Johanna Drucker Chair of Media Studies, University of Virginia, Author: The Alphabetic Labyrinth
John H. Fisher Medievalist, Leading authority on the development of the written English language, Author: The Emergence of Standard English
Dr. Malcolm Richardson Chair, Dept. of English, Louisiana State University; Research: The Textual Awakening of the English Middle Classes
James Wendorf Executive Director, National Center for Learning Disabilities
Leonard Shlain Physician; Best-Selling Author: The Alphabet vs. The Goddess
Robert Sweet Co-Founder, National Right to Read Foundation

FULL LIST OF OVER 100 COMPLETED INTERVIEWS



Copyright statement: Copyright (c) 2014, Learning Stewards, A 501(c)(3) Non-Profit Organization, All Rights Reserved. Permission to use, copy, and distribute these materials for not-for-profit educational purposes, without fee and without a signed licensing agreement, is hereby granted, provided that "Children of the Code - www.childrenofthecode.org" (with a functioning hyperlink when online) be cited as the source and appear in all excerpts, copies, and distributions. Thank you. (back to top)