The War Department Papers manuscript transcription project

I have been looking at scanned and digitally stored manuscripts from the War Department Papers project. The U.S. government, from the Revolutionary War era Continental Congress, through the Congress under the Articles of Confederation, and from 1789 onward under the Constitution, had a “War Department” until the late 1940s, when it became the Department of the Army, located under the new Department of Defense.

I started out looking for materials about “Daniel Shays’s Rebellion” of 1786-1787 in Massachusetts. If the rebels or insurgents had captured the magazine at Springfield, Massachusetts, they could have taken 7,000 muskets, gunpowder, and other military equipment. “WHAT IF” is an interesting speculation. They were driven off by a state militia force. The incident was one of the “final straws” that got leading Americans and politicians to decide that the Articles of Confederation needed to be “amended” or “completely replaced”. The end result was a convention in Philadelphia that drafted our national Constitution.
I have transcribed 15 documents and am waiting to get a user name and password to send my work in to the War Department Papers project. I have worked on manuscripts from Thomas Jefferson, Henry Knox, Benjamin Lincoln, John Pierce, James Deblois, James W. Henry, Samuel Hodgdon, and John Adams, and on a letter to Johnathan Grant.
I have been looking through the documents from 1788, and have picked documents written by John Pierce, Rufus Putnam, Henry Knox, Oliver Wolcott, Jr., Benjamin Lincoln, and Josiah Harmar.
Contact me for more information about the War Department Papers project in general, or about the manuscripts that I have transcribed. None of them are posted yet at the War Department Papers website.

Six years of lindseyt’s blog, milestone or millstone?

20,932 spam comments blocked; 117 spam comments waiting to be reviewed. Last year, I said that there were over 20,000 comments in the blocked category.
That would have been an average of about 4,000 spam comments a year during the first five years, and fewer than 1,000 in the past year.

Is the decrease a permanent trend? Are blogs obsolete, and not even spammers are bothering to post to them? What are your opinions? How can I find information on the number of currently active blogs, and the number of blogs created year by year in the past? How many new blogs are being created in 2013?

I think I know why the NSA and others believe that it is necessary to monitor so much internet transmission

Not sure why my message explaining why I think that I know why “THEY” think that it is necessary did not get posted.  I am not going to repost it; I have other things to tell you about in other posts.  Sorry for the inconvenience if you were looking for my explanation.

My last work day at the University of Texas at Arlington was December 31, 2012.

Thanks again to Brian Erickson

Brian has been helping me read some late-1700s quill pen manuscript letters that I am trying to transcribe. It is great to have another person work on the puzzle. We found some VERY interesting material about Daniel Shays, leader of Shays’s Rebellion, and will try to work on more of them.

Finding a topographic map in the Government Documents Map Collection, University of Texas at Arlington Library (Floor 2)

The topographic maps are filed in folders using two different classification number series. The original series uses the Superintendent of Documents classification number stem I 19.81: followed by the latitude/longitude digits of the lower-right corner of the map. Folder I 19.81: 32097 contains maps of Tarrant County, Texas.

The second series, used for any map that depicts any part of a National Forest or National Grassland, inserts a National Forest/National Grassland “Cutter Code” and the year of publication before the latitude/longitude digits. The Kisatchie National Forest in Louisiana (cutter code K 62) has land in 7 parishes and covers multiple latitude/longitude quadrangles.

This series begins with a few maps in 1992 and continues through 2008. More than 1,400 revised topographic quadrangle maps are arranged in Forest Service cutter-number folders. Some of the early-year maps may be interfiled with the maps in the digit-only folders.

Ask the librarian in Room 215 for help in determining whether the library may have a map sheet in the Forest Service cutter-code map folders, I 19.81: <cutter code>/<year>/<latitude-longitude digits>, that supersedes a map arranged by latitude/longitude quadrangle.
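The folder-numbering scheme described above can be sketched in a few lines of code. This is a minimal illustration, not an official algorithm: the exact punctuation between the cutter code, year, and digits in the second series is an assumption on my part, and the helper names are hypothetical.

```python
# Illustrative sketch of the two I 19.81: folder-number series described above.
# The separator between cutter code, year, and digits is assumed, not official.

def quad_folder(lat_deg: int, lon_deg: int) -> str:
    """Digit-only series: whole-degree latitude and longitude of the
    quadrangle's lower-right corner, e.g. 32 N, 97 W -> 'I 19.81: 32097'."""
    return f"I 19.81: {lat_deg}{abs(lon_deg):03d}"

def forest_quad_folder(cutter: str, year: int, lat_deg: int, lon_deg: int) -> str:
    """National Forest/Grassland series: cutter code and year inserted
    before the latitude/longitude digits."""
    return f"I 19.81: {cutter}/{year}/{lat_deg}{abs(lon_deg):03d}"

print(quad_folder(32, -97))                        # Tarrant County, Texas
print(forest_quad_folder("K 62", 1994, 31, -92))   # a Kisatchie NF quadrangle
```

The longitude is zero-padded to three digits so that, for example, 97 W prints as 097, matching the five-digit codes such as 32097 used in the folders.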

Next, visit the Geonames server of the U.S. Geological Survey and choose the “Search Domestic Names” link. The next web page will be a search form.

The form has search blocks for Feature Name, State, County, Feature ID, Feature Class, and Elevation. Here are some common feature classes to use in a search to determine the latitude, longitude, and map sheet name.

  1. Airport for a named airport or airfield.
  2. Bridge for an officially named bridge.
  3. Cemetery for a named cemetery that is at least 50 feet by 50 feet in dimension.
  4. Census for a Census Designated Place of the Census Bureau. The Woodlands, Texas is a Census Designated Place with more than 80,000 residents.
  5. Church for a church in a rural, sparsely populated area, or a prominent building in an urban area.
  6. Civil for a county or parish name. This search will return a list of all topographic map sheets covering the county or parish.
  7. Dam.
  8. Forest. Type in the name of a National Forest such as Mark Twain National Forest, and a record will be returned. Click on the feature name, and the next result page will include the names of counties, the coordinates, and the map sheet name for each map depicting part of the forest. This may also work with many state forests.
  9. Lake. Also try Reservoir.
  10. Oilfield.
  11. Park. Use for local parks, state parks, and stadiums.
  12. Populated Place. Use for incorporated cities and towns.
  13. Post Office. May work for heavily populated areas, but not in rural areas.
  14. Reservoir. Use in conjunction with Lake.
  15. Stream. Use with any linear body of water. The Mississippi River is a “stream” in this system.
  16. Summit. Search for mountains or hills using this feature.
  17. Woods. Can be used to search for a named forest, a thicket, or a named woods.

The Feature record in the Geonames server will include a latitude and longitude, and a map sheet name.
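The search procedure above amounts to filtering feature records by class and state and reading off the coordinates and map sheet name. The sketch below works the same way against a tiny made-up sample in the pipe-delimited style the GNIS download files use; the field names and sample values here are illustrative assumptions, not the official layout.

```python
# Minimal sketch: filter GNIS-style pipe-delimited records by feature class.
# Field names and sample rows are illustrative, not the official GNIS layout.
import csv
import io

SAMPLE = """FEATURE_NAME|FEATURE_CLASS|STATE|COUNTY|LAT|LON|MAP_NAME
Mark Twain National Forest|Forest|MO|Oregon|36.7|-91.4|Birch Tree
White Rock Lake|Lake|TX|Dallas|32.83|-96.72|Dallas
"""

def find_features(text, feature_class, state=None):
    """Yield (name, latitude, longitude, map sheet) for matching records."""
    reader = csv.DictReader(io.StringIO(text), delimiter="|")
    for row in reader:
        if row["FEATURE_CLASS"] != feature_class:
            continue
        if state is not None and row["STATE"] != state:
            continue
        yield row["FEATURE_NAME"], row["LAT"], row["LON"], row["MAP_NAME"]

for hit in find_features(SAMPLE, "Lake", "TX"):
    print(hit)
```

The same filter-by-class idea applies whether you use the web form or a downloaded file: the feature class narrows the search, and the record returned carries the coordinates and the map sheet name you need to pick a folder.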

The library’s 7.5-minute series 1:24,000 scale maps are in map cabinet folders with classification number codes that begin I 19.81: followed by the latitude/longitude digits. Dallas County is in folder I 19.81: 32096, Tarrant County in folder I 19.81: 32097, and Denton County in folder I 19.81: 33096.

The library has some maps at the 1:25,000 and 1:50,000 scales in folders classed I 19.81/2:. Most of these maps are for Texas quadrangles.

The Bureau of Land Management publishes maps of the United States west of longitude 103° W (New Mexico, Colorado, Wyoming, and Montana, and states to the west), and of North Dakota and South Dakota west of longitude 101° W. There are two map series, assigned Superintendent of Documents classification numbers I 53.11/4 and I 53.11/4-2.
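The coverage rule above reduces to a simple longitude test with a special case for the Dakotas. The sketch below is a simplification under stated assumptions: it tests longitude only (negative values meaning degrees west) and treats the Dakotas by state abbreviation, ignoring the state-by-state details of the BLM series.

```python
# Hedged sketch of the BLM coverage rule described above: the series cover
# territory west of 103 W, plus North and South Dakota west of 101 W.
# Longitude is signed, with negative values meaning degrees west.

def blm_map_coverage(state: str, lon_deg: float) -> bool:
    """True if a point falls within the area the BLM map series cover."""
    if state in ("ND", "SD"):
        return lon_deg <= -101.0
    return lon_deg <= -103.0

print(blm_map_coverage("SD", -102.5))  # western South Dakota: True
print(blm_map_coverage("TX", -97.0))   # Fort Worth area: False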

A 2007 Forest Service Table is available to identify Congressional Districts and Counties containing U.S. Forest Service land.  Some forests and grasslands are in multiple counties and multiple Congressional Districts.

Visit U.S. Forest Service Table 6 – NFS Acreage by State, Congressional District and County


Minutes of the North Texas Documents Group May 2012 meeting.

North Texas Documents Librarians Group

Minutes and Notes for the May 17, 2012 Meeting.

Attendees:  Thomas Lindsey, University of Texas at Arlington; Julia Stewart,  Southern Methodist University;  Brenda Mahar, University of Texas at Dallas; Charlotte Bagh, Dallas Public Library; Jenne Turner, University of North Texas

Location:  Arlington Museum of Art, West Main Street, Arlington, Texas.

What’s happening at each library

Southern Methodist University:

Julia Stewart said that the Deselect feature of the Amendment of Item Selections system at the FDLP homepage site was not working.  [This may have been related to the change in system which now allows for additional selections to be made at any time, but additions continue to become effective the first business day of October. ]

The S.M.U. Library Assistant Dean, Pat Van Zandt, is moving to East Tennessee State.

The search for Dean of Hammond Library has been suspended.  Julia is reporting to a person who has just arrived at the library.

The Dean of the Library wants a renovation project started by 2013 so that it would be a university centennial project.  The Government Documents area will be moved to another location.

A new curriculum starts at S.M.U. in Fall, 2012.  The librarians are trying to meet professors to learn about what they can do to assist faculty and students.

University of North Texas:

Jenne Turner said that Jesse Silva of the University of California at Berkeley will become the new head of government documents.  He will start in August.  The university is soliciting money for the new building addition to the current Willis Library.  There are five employees in external relations at the library, and three more employees for the rare book collection.

A U.N.T. government documents librarian is auditing 2 courses in business each semester.

Dallas Public Library:

Charlotte Bagh said that the Interim Director is moving to Chattanooga, Tennessee, to become director of the library there. A new branch in the White Rock Lake area is the “White Rock Hills” branch.

The Dallas Public Library budget for the next fiscal year seems to be okay.  Staff cuts and mandatory furlough days for city employees started with the 2010 fiscal year budget.  There is a possibility that both will be eliminated.

One of the most heavily used areas at DPL is the “job seeker center”. Julia of SMU said that she thinks that the SMU library has a number of people who have given up job searching and are just hanging out. Jenne of UNT said the Willis Library has community users who may be people who are turning off air conditioning at home to reduce electric power bills, and are spending the day at the library.

University of Texas at Dallas [Brenda Mahar]:

A large conversion project that is withdrawing print publications and searching for electronic links is underway.  UTD had submitted 2012 disposal list number 90 at the time of this meeting.

Acquisitions and Electronic Resources were merged.  A section of the Tech Services area will be renovated.

The library is using Voyager, but is upgrading to Alma, the next system.

A part time librarian is going to the school of undergraduate education.  Two librarians are spending some of their time at the School of Management.  [Julia of SMU said that SMU Engineering Librarians are doing something similar.]

University of Texas at Arlington [Tom Lindsey]:

An inventory of the print publications is being done by the Metadata Services department, which does cataloging and metadata work. All publications not found in the catalog are brought to their area, where Tom Lindsey makes decisions about what to keep and catalog, or withdraw. Many previously unknown publications are being found. Tom Lindsey is contacting liaison librarians and faculty members to notify them of these previously “invisible government publications”.

The group discussed the state forecast and survey that we are sending to GPO and to our regional depository libraries at Texas Tech and the Texas State Library.

Thomas Lindsey, University of Texas at Arlington Library, recorder, posted to blog lindseyt’s blog http://blog.uta.edu/~lindseyt on August 23, 2012.

Five years of lindseyt’s blog, milestone or millstone?

I received a message in Cyrillic text (Russian). I will use a dictionary to translate the words into English.

At the moment, I know that it refers to my first blog entry, created on August 13, 2007.   Has it really been 5 years?  I thank the WordPress system for transferring more than 20,000 messages directly into a “spam folder”.

I have never been sure if any real human beings read any of my messages. I have taken the text of many replies and searched for them on other blogs, and frequently find the same message as was sent to me. I wish that I could remember the exact statement of another internet message poster, but it was something like this: “The Internet and the World Wide Web have demonstrated that this hypothesis is wrong:

If you take a large group of monkeys, put them in front of typewriter keyboards, and let them randomly type away, the works of Shakespeare will eventually be found in their typewritten pages.”

What HAS been proven are the statements in a book by Elaura Niles: Some Writers Deserve to Starve: 31 Brutal Truths About the Publishing Industry. Cincinnati, Ohio: Writers Digest Books, 2005. ISBN 1582973547

My time, and the time that my employer pays me to work, are better spent helping other people advance their level of knowledge toward award of college degrees, to develop skills in critical thinking, and to do other work that makes our planet and human civilization better because we have lived.

The Future of Automatic Data Processing (1967 Speech)

I have been reviewing our federal government publications collection, and found a set of volumes from the former “Industrial College of the Armed Forces”, which was based at Fort Lesley J. McNair, Washington, D.C. It operated under the Joint Chiefs of Staff of the military services. The National Defense University is its successor.

I have scanned several articles from the 1960s for forwarding to other staff members. The article below is one of them. I am in the process of manually cleaning up the scanning job and putting paragraphs in correct order. The article starts on page 19 and ends on page 25.

Scanning devices and their systems are marvelous, but the last sentence of this article is still relevant 47 years later:

“In short, our reliance on the computer is still far from total, and I do not think that it will ever be.”

I will keep correcting this article and putting paragraphs in proper order. The “Belfcoram, Inc.” company listed as the speaker’s employer was actually Bellcomm, Inc. –TKL

February 1967

PERSPECTIVES IN DEFENSE MANAGEMENT
INDUSTRIAL COLLEGE OF THE ARMED FORCES
[University of Texas at Arlington Library Government Documents rubber stamp imprint on cover]
—————-
INDUSTRIAL COLLEGE OF THE ARMED FORCES
WASHINGTON, D.C.
AUGUST SCHOMBURG, Lieutenant General, USA
Commandant
J. J. APPLEBY, Rear Admiral, USN
Deputy Commandant

The Industrial College of the Armed Forces is a joint educational institution operating under the direction of the Joint Chiefs of Staff and is the capstone of our military educational system in the management of logistic resources for national security.

PERSPECTIVES IN DEFENSE MANAGEMENT
PUBLISHED BY THE INDUSTRIAL COLLEGE OF THE ARMED FORCES • FEBRUARY 1967

“PERSPECTIVES”
Lieutenant General August Schomburg, USA .... ii
THE ROLE AND RESPONSIBILITY OF MANAGEMENT IN THE AMERICAN ECONOMY
Laurence I. Wood .... 1
FOSSIL FUELS AND UNITED STATES STRENGTH
Bruce C. Netschert .... 11
THE FUTURE OF AUTOMATIC DATA PROCESSING
Isaac D. Nehama .... 19
THE EXECUTIVE OFFICE OF THE PRESIDENT AS A MANAGEMENT TOOL
Donald J. Carbone .... 27
COST-EFFECTIVENESS ANALYSIS OF COMMAND AND CONTROL SYSTEMS
Captain Robert L. Baughan, Jr., USN .... 39

Perspectives in Defense Management is a representative selection of current presentations and papers drawn from the educational programs of the Industrial College of the Armed Forces. It is published by the College on an infrequent, nonperiodic basis. Each issue is distributed initially to a limited list of present and former Industrial College faculty members and students; Industrial College lecturers, panelists, and consultants; officials of the Department of Defense, Military Service departments, and other Government agencies; major Service and joint commanders; military and other Government schools and colleges; Defense-oriented trade and professional associations and business firms; civilian colleges and universities and libraries. Additional copies are available in limited number on request.

The views or opinions expressed or implied in this publication are the author’s and are not necessarily those of any agency of the U.S. Government, nor of the Industrial College. Members of the College may quote from the contents only in student reports or publications for use within the College. Other persons may not quote or extract for publication, reproduce, or otherwise copy this material without specific permission from the author and from the Commandant, Industrial College of the Armed Forces, in each case. Comments and suggestions are invited. Please address the Editor, Perspectives in Defense Management, Industrial College of the Armed Forces, Fort Lesley J. McNair, Washington, D.C. 20315.

Before I start, we should agree on a common view of the nature of the computer. Traditionally the computer has been thought of in the popular mind as an instrument for doing arithmetical calculations - very rapidly, to be sure, but just as an arithmetical calculator. However, as most of you know, it is possible to encode information in the form of numerical symbols, and the computer can perform operations with these symbols. If the symbols are those of algebra, then the computer is doing algebraic manipulations. If the symbols are pictorial - for example, from a satellite overflying a planet - and if the computer uses this information to reconstruct a picture, then the computer is a pictorial processor.

Thus, in a broad sense, the computer is an information machine. It accepts information from its environment through its input devices. It processes the information according to the program stored in its memory, and it then sends back the information to its environment through the output devices.

As an information processor the computer resembles the human brain. It also resembles social institutions, in that they all accept information, process information, and put information out, although they all do it somewhat differently, and, incidentally, in ways which are not perfectly understood.

The notion of the computer as an information processor is essential to an understanding of its truly universal nature. It is also indispensable in trying to gauge the impacts the computer may produce in its future development.

As you know, the digital computer, as hardware, consists of input-output devices, of memory, and of arithmetic and control circuits.

THE FUTURE OF AUTOMATIC DATA PROCESSING
ISAAC D. NEHAMA

GENTLEMEN: In the past 15 years of its commercial life the computer has had a truly revolutionary impact on society and on every aspect of human life. I don’t base my statement so much on the usual statistics of growth, although these are impressive enough. For example, in only 15 years the industry has grown, from 1951 to 1966, with annual sales from zero to over $2 billion. The number of computers in this country rose from less than 10 in 1951 to more than 30,000 now. There were less than 10 applications in the beginning; now there are over a thousand. There has been a professional growth of people from 1,000 to over half a million, and the expenditures by the Federal Government have risen from something less than $10 million to over $1.5 billion.

These statistics are very impressive, but at best they show only the quantitative impact of the computer on society. However, we know that the automation of operations in such areas as banking, inventory control, logistics in the armed services, engineering design, and air defense are not merely increasing efficiency but have brought basic transformations both in the methods by which operations were conducted and in the organizations themselves.

Let me use an obvious parallel. The introduction of the tank, the plane, and the missile not only increased the mobility of men and weapons by a factor of 10 - an order of magnitude - but also caused profound changes in the nature of the services themselves and in the stra-

MR. ISAAC D. NEHAMA, Director, Analysis and Computer Sciences Division, Bellcomm, Inc., was born in Athens, Greece. He attended the University of Illinois, where he received his bachelor’s and master’s degrees in electrical engineering. Mr. Nehama has held positions with the Bell System and the Computer Sciences Department of the RAND Corporation. His present position involves studies in all areas of computer technology as they apply to space programs. He also participates in systems engineering support for the National Aeronautics and Space Administration Office of Manned Space Flight.

This talk, published here in condensed form with the author’s approval, was presented to the College on 29 September 1966.
Equally essential in completing the portrait of the computer is the program of instructions - the software - because without it the computer would be a useless collection of electronic parts.

Let me speak now about the anticipated advances in hardware and software, say, by 1975. The arithmetic, logical, and control circuits of a computer are like the central nervous system in a man. We call that the central processing unit, or the CPU. The CPU, together with the memory, is sometimes called, or referred to as, the main frame. I will use this term interchangeably.

Let me start with the CPU. A recent development which introduces a new era in computer technology is the advent of microelectronics. This term is used to describe the manufacture of electronic circuits of very small size. Let me illustrate.

Imagine a circuit board from a present day computer. It is a plastic card five inches on the side. On one side you see a number of electronic devices - transistors, capacitors, resistors - about 15 or 18 of them, and on the other side you see a wiring pattern. Five or ten years from now, if I were to show you the same functionally equivalent circuit it would be impossible for you to see it at this distance. It would be smaller than the head of a pin. This innovation has three far-reaching implications. The first is that the number of individual devices in the computer will decrease drastically. The number of interconnections, today the most vulnerable part of an electronic system, will also decrease drastically. Therefore, the reliability of the system will increase. We anticipate an improvement by a factor of 100 to 1,000 in the next 10 years as a result of microelectronics.

Second, the materials going into fabricating this very complex circuit are extremely cheap, and so are the manufacturing processes involved - essentially the same ones that we use today when we manufacture a single transistor. In addition, because of the reduction of the number of devices, many materials and assembly steps that we have to use today, such as boards, terminals, connections, connectors, will be eliminated or drastically reduced. Therefore, costs will go down. We anticipate a reduction by one or two orders of magnitude - a factor of 10 to 100 - in the next decade.

Third, the reduction in the physical size of these devices provides a means of overcoming a natural barrier to increasing computer speed. Today’s transistors, for example, operate at speeds of a billionth of a second. We call that a nanosecond. At those speeds everyday physical dimensions, inches and feet, become significant. For example, light, or an electrical signal, travels one foot in one billionth of a second, or in one nanosecond. Therefore, any increase in computer speed has to come from a reduction in the distances that electrical signals must travel - the physical dimensions of the wire interconnections. Microelectronics will accomplish this quite drastically, so that by 1975 we expect improvements in speed in the order of 100 to 1,000.
Let me now turn my attention to the memory or storage component of the computer, what we call the internal and peripheral memories. For the last 10 years the backbone of storage technology has been the magnetic core. In this period the performance and cost of core memories has improved by a factor of 25 or so, primarily because of progress in the ability to build smaller and smaller cores. In the middle fifties, a magnetic core had the dimensions of 80 thousandths of an inch. It was able to operate in 20 or 30 microseconds, millionths of a second, and its assembled cost was between $1.00 and $5.00 per bit. Now we are producing cores in great quantities. They have a diameter of less than 10 thousandths of an inch, they operate in half a microsecond, and they cost, when fully assembled, about a cent a bit.

However, there are fundamental limitations on further reductions in the size of magnetic cores. The only hope for improvement in the next 10 years lies in batch-fabrication techniques, which will reduce cost, but with only slight improvement in speed.

The best prospect now for real improvement in memories is the thin magnetic film. We have known for almost 10 years that thin metallic films exhibit the magnetic properties required to store information, and can operate at least 10 to 100 times faster than magnetic cores. But during this time there were severe problems in mass fabrication of thin film arrays. It is only recently that we have been able to see much hope for solving most of these problems.

Before attempting to predict the state-of-the-art of memory devices in 1975, it will be profitable to point out a relation that governs memory assemblies in general. Briefly, as the capacity (number of bits) of a memory device goes up, the speed goes down and the cost goes up. Since our appetite for large memories is almost boundless, we have a real impasse. Although in the next 10 years we can expect improvements in cost and speed of memory devices in the order of 10 or 100, the effective improvements will be much less, since the push for larger memories is not likely to abate.

This means that for some time to come we will have to live with the present situation - namely, a hierarchy of memory devices ranging from devices which are very fast but of low capacity, to large, bulk stores of relatively low speed: thin films, cores, drums, discs, and magnetic tapes in that order.
Next we consider the input-output (I-O) component. For most of the last 15 years the major type of input-output devices have been electromechanical, such as card punches and readers, tape punches and readers, and printers. Compared to CPU and storage (the main frame), input-output devices were largely neglected by computer technology. The reasons were purely economic.

First of all, the CPU and the memory, being electronic, were at least a thousand times faster in the beginning than the input-output devices. Furthermore, because the CPU and memory were the most expensive items in the computer system, it was important to maintain their efficiency. So the whole effort was directed not so much toward improving I-O devices, as toward finding means to make it possible to run the CPU and the I-O without compromising the efficiency of the main frame. This was done through the introduction and refinement of the “data channel” and “input-output processor” - pieces of equipment which stand between the CPU and the I-O. For this reason, man had to be satisfied with I-O devices of limited richness in displaying information, principally through printed characters.

But two strong forces are currently at work which will radically improve computer input-output.

First, technological advances are pushing the cost of the CPU steadily downward, as I explained before, so that it isn’t any longer the predominant cost factor in the computer system.

Second, we are beginning to realize that if the computer is to be useful to man as an aid to his understanding, it ought to be able to communicate with man through every proven means known to be essential to human understanding - diagrams, sketches, pictures, graphs, even the spoken word. A great deal of work is currently being done on graphical devices for displaying information to man in a great variety of formats, and also for manual input of similar information from man to the computer.

One may visualize the input-output device of the future. Such a device would combine a conventional typewriter keyboard, and a cathode ray tube display with a light pen which is used to point to a given part of the display when requesting the computer to perform some operation, such as an enlargement, or to enter data or drawings.

The exploration of graphical input-output equipment has barely begun, and its full potential is not known yet. I think, however, it will be possible for you to imagine the tremendous new vistas that graphic input-output will present to man and also the tremendous opportunities for enriching man-machine communication.

Considerable research is also being done in artificial speech, and some of the progress made so far is impressive. [Recording of talking and singing.] What you heard is a computer speaking and singing. The machine had only a text in its memory and a program of how to translate the written words into spoken syllables. It may not be quite the most moving of the soliloquies of Hamlet you have ever heard, but I think you will agree that it was perfectly understandable. So I think that within the next 5 or 10 years voice output from the computer is very much in the cards.

The opposite process, voice input or recognition of speech, is a much more complex problem. I think we are going to have to wait beyond 1975 for that.
Let me move now to the area of software.
There, too, we expect in the next 10 years to
make considerable progress. However, it is
very hard to give you quantitative measures for
the improvement in programming. Qualitatively,
I would say that the principal effect
would be that it would become a lot easier for
the nonprofessional user to communicate with
a computer.
I want to emphasize the word “communicate,”
because we have agreed to view both man
and machine as information processors. What
you do with a computer is communicate. Since
the transfer of information, whenever man is
involved, always implies the process that we
call language, it is easy to understand why any
progress in computer programming depends almost
entirely on our understanding of human
language.
Now, there are two aspects of language. One
is meaning-what you say, and the other is syntax-
how you say it. For the last 10years most
of the research in mechanical linguistics has
dealt almost entirely with syntax-a-the rules of
how you say things, using p’ropositions that are
very simple iri concept, like those of mathematics
and logic. From this research we
learned that, given a rich syntax, you can build
very complex structures out of the few basic
concepts of a language.
We also learned from this research that the
language—vocabulary plus syntax-which is
best suited to expressing a proposition in a given
field need not be the same language that the
computer “understands.” The job of translating
is a simple mechanical routine, and the
computer is perfectly capable of doing that job
itself.
So, we have decided that, instead of trying to
build a single, universal language which all human
beings and all machines would converse in,
we have already constructed many different
languages, and we will be able in the future to
frame languages which are completely natural
to the user, matching his training and his way
of thinking.

22 PERSPECTIVES IN DEFENSE MANAGEMENT

Moreover, the nonprofessional
user should be able to use the machine very
easily, without knowing or needing to know the
inner mysteries and structure of the computer,
in the same way that some automobile drivers
drive quite well without knowing “what is under
the hood.”
Thus, today it is quite possible to take a person
with only a high-school education and, with
a little training, make of that person a good
programmer, which is a far cry from the situation
which we had in the beginning, when you
had to have a rather eccentric-looking individual
with a beard and sandals, who was a combination
of artist, mathematician, and engineer,
in order to be able to talk to the computer.
I think it is a remarkable achievement that
we have been able, through this linguistic approach,
the emphasis on syntax, to use the computer
in such diverse fields with dramatic effects.
It is my personal belief, however, that we
have paid a high price for this particular ability.
Since today’s computers cannot tolerate ambiguity,
our mechanical languages, our computer
languages, are very rigid in their syntax.
The rules are quite intolerant of mistakes or
aberrations. Consequently we are not yet capable
of constructing and expressing thoughts
to the computer that are rich in meaning. In
order to do that, I think, there would have to
be a shift in emphasis, in research, from syntax
to meaning, or what is known as semantics. For
this, too, we will have to wait until after 1975.
Now, one of the implications of all of these
advances, both in hardware and in software, is
that the way in which the computer is going to
be used, regardless of application, will have to
change drastically.
Let me deal with the situation very briefly.
If we consider again the main three components
of the computer system, the 1-0, the memory,
and the CPU, their costs in 1975 will be distributed
about as follows: The CPU, no larger
than a shoe box, could be bought for about
$1,200, or, with a terminal, which may be a
combination of typewriter and CRT, approximately
$2,000. For a memory of a billion bits,
which is really not very much, perhaps equivalent
to 100 books, you would have to pay $100,000.
So how can we arrange for everyone to
have his personal computer?
The answer is: share the cost! With, say, a
hundred thousand bits in your personal computer,
at a cost of $2,000, you could share big
chunks of memory in a central location with
many other people, thus reducing the overall
cost. This is the principal motivation for the
type of operation referred to now as “timesharing.”
Under this concept, not only the
memory but also the central processing unit may
be shared. All the user needs is a terminal for
input-output.
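The cost argument can be made concrete with the 1975 figures just quoted ($2,000 for a CPU plus terminal, $100,000 for a billion-bit central memory). A quick sketch in modern Python (the function name is mine):

```python
# Back-of-the-envelope arithmetic behind time-sharing, using the
# forecast 1975 cost figures quoted in the talk.
TERMINAL_COST = 2_000          # CPU plus typewriter/CRT terminal, in dollars
CENTRAL_MEMORY_COST = 100_000  # one billion bits of storage, in dollars

def cost_per_user(n_users):
    """Total outlay per user when the central memory is shared."""
    return TERMINAL_COST + CENTRAL_MEMORY_COST / n_users

print(cost_per_user(1))    # going it alone -> 102000.0
print(cost_per_user(100))  # shared among 100 users -> 3000.0
```

Sharing the memory (and, under full time-sharing, the CPU as well) is what turns a six-figure installation into something an individual terminal user can afford.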
Now, for this kind of system to be economically
feasible the cost of the communication links
to the central facility will have to be reduced.
What are the prospects for doing this? I think
that the prospects are good. The available
data show that, by the end of this year, or by the
next year, the number of long-line circuits in the
Bell System that will be carrying data will exceed
the number of circuits which carry voice,
regular telephone messages, so that in a sense
the computer itself is providing a strong stimulus
toward the shift of communication into data.
Another prospect for lower communication
costs is that certain recent technical advances
make it possible to transmit broad-band data
even as high as 20,000 bits per second over regular
telephone facilities, just ordinary telephone
telephone wires, which were designed for transmission in
a range of a few kilobits a second.
Finally, the advent of the communication
satellites has made it necessary for all new communication
equipment installed since 1964 or so
to be entirely digital.
Let me try to summarize. In 1963 a graduate
student of Carnegie Tech wrote his doctoral dissertation
on technological innovation as exemplified
by the digital computer. He constructed
a functional model of the computer including
all the important factors affecting performance,
such as speed, memory, input-output.
One of his significant findings was that from
1950 through 1962 the annual rate of improvement
in a computer has been 81 percent for
scientific computation, and 87 percent for commercial
computation. This means that every 3.8
years you improve by a factor of 10. You improve
by two orders of magnitude, a factor of
100, in 7.5 years, and by a factor of a thousand
in 11.5 years.
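The dissertation's arithmetic checks out: an annual improvement rate r gives a factor of 10 every log(10)/log(1+r) years. A quick verification in modern Python:

```python
import math

# Check the growth arithmetic quoted above: at an annual improvement
# rate r, a tenfold gain takes log(10) / log(1 + r) years.
def years_per_tenfold(rate):
    return math.log(10) / math.log(1 + rate)

print(round(years_per_tenfold(0.81), 1))  # scientific computation -> 3.9
print(round(years_per_tenfold(0.87), 1))  # commercial computation -> 3.7
# Both sit close to the 3.8-year figure in the dissertation, and a
# factor of 10 every ~3.8 years compounds to 100x in about 7.5 years
# and 1,000x in about 11.5 years.
```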
Now, it is a widely held belief that, whenever
a technological innovation produces quantitative
improvements by more than a factor of 10,
it usually has a revolutionary impact on its environment.
Just imagine the effects on our
society if, by 1975, you could buy today's $50,000
house for only $500, or a car which sells
today for $3,000, for 30 bucks! And yet this is
precisely the sort of scale of improvement that
we are forecasting for the computer.
Now, of the many attributes of the computer,
the one above all others which makes it really
an extension of the human mind is its ability to
deal with those mental constructions which we
call models. A model is a description of a physical or abstract thing, real or imaginary. A model can be a description of a military battle, or the energy transformations in a star, or the biochemistry of a living cell, or a human organization. Any simulation is an experiment using a model, but, with a computer simulation, you can run the experiment as long as you like, with as many runs as you like, with as many different sets of conditions as you like. The computer can have a part in controlling the experiment, and can interpret the results, all at very low cost.

Extrapolating into the future may sound like science fiction. But I think that the technological advances which I have discussed will give us a reasonable basis for expecting certain developments by 1975. Computers are going to be inexpensive; they are going to be powerful, fast, reliable, and small in size. Large memory banks for data and programs will be centralized and will be accessible to a large number of users. Computers will be time shared. Computers will communicate with human beings through a rich variety of formats, including graphics and voice. Computers will be easy to use by means of user-oriented languages.

If you accept these expectations, then let me suggest a number of things based upon them, which I think are perfectly possible in the future. I hope you will not insist that I predict when they will be realized, or even the probability of their realization.

Let me start with science. The computer will be the most important tool for scientific research. If anyone doubts this statement, let him pick up any journal of experimental science. He will see it is principally devoted to the use of computers in experiments. The computer has two roles in science. One is as an instrument, for example, as a calculating machine. This is the traditional role. There are certain problems in science for which bigger and faster machines are still needed. One such area is weather prediction. Even with the most simple mathematical description of the atmosphere and of the earth, if we run simulations for weather prediction with present-day machines, we tax the capability of even the fastest machines in existence. And, as you know, the predictions are not always satisfactory. If we want to use more sophisticated models with the present machines, we will be predicting tomorrow's weather in about a month. The predictions may be correct, but they won't be very useful.

So we need bigger machines. I believe that in about 5 or 10 years we are going to be able to solve the problem of weather prediction with a combination of data from weather satellites and bigger machines. The benefits are estimated in the billions of dollars annually, at least in this country. Furthermore, the knowledge that we are going to derive about weather prediction will give us, I think, the insight of how to control and possibly modify weather. We may not have at that time sufficient energy forces to do this, but I think we will know how. We also know that developing weather states do not depend so much on the magnitude of the forces involved as on the fine balance between such forces. This fine balance can very easily be upset by smaller forces, which may be within our capability to muster. I will leave to your imagination the consequences of such ability to modify weather.

But it is the computer's second role in science which promises to have the biggest impact. This is the computer's use in simulations. About 3 years ago, a husband and wife team, both of them psychologists, wrote a program to test the theory of an eminent psychologist on the social behavior of a small group of human beings. Specifically, they were trying to test 10 commandments, that is, the 10 fundamental laws, in the theory. Many simulations were run. I forget now whether some of the laws were confirmed and some of them were rejected; but the computer did a remarkable thing. It began to see certain new behavior patterns developing which had not even been mentioned in the theory. These patterns turned out to be a lot more significant than the laws being tested, and they were later confirmed by observation of actual groups.

This is the most significant role that the computer is playing, not so much as an instrument but as an actor, a participant, in the development of new scientific theories. I think the laboratory as the birthplace of new scientific knowledge is going to be much out of fashion in the next 15 or 20 years. It will be replaced by the computer. In technology and in engineering this has already taken place. What engineer today designs things by actual models in the laboratory?

I started my career by designing filters at the Bell Telephone Laboratory. That laboratory no longer exists. Now only the computer designs filters in communications. I do not mean that the machine will actually provide designs. The engineer may provide the initial matrix and then ask the machine to start filling in the missing parts. The important thing would be the ability to change certain things in a given design and have the machine be able to calculate the implications of such changes throughout the design.

THE FUTURE OF AUTOMATIC DATA PROCESSING 23

Let's look at another field. I think that the biological and the medical sciences are going to use computers even more than the physical sciences. As you know, medical knowledge has been increasing at an explosive rate, and the point has long been passed where the useful body of knowledge in medicine can be remembered
by a single physician. Why could we not construct
a huge file in a central place containing
the full compendium of medical knowledge?
The doctor, in order to use the file, goes to his
console and feeds in the profile of a given patient:
age, weight, sex, temperature, blood pressure,
and any symptom of illness from sore
throat to unconsciousness. Almost at once the
computer will respond with a list of all
the known diseases that might account for the
symptoms. On the doctor’s instructions, the
machine will list the next logical steps that are
needed to narrow the diagnosis, from X-rays to
a series of chemical tests. When the patient’s
illness is fairly well defined, the computer can
tell the doctor the commonly accepted treatment
for the problem, from aspirin to major surgery.
The doctor, of course, will be free to disregard
this advice. And there will also be room for
argument between the doctor and the computer.
He may have been considering a disease that the
computer did not include in the list. Then the
computer will have to say why it wasn't
included.
In addition to acting as a diagnostician, the
computer can also be a superconsultant for purposes
of educating medical students. The student
may ask the computer to form the model
of a patient, and then try out his knowledge and
skill in diagnosis.
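The consultation loop described above can be sketched as a simple table lookup. This is a toy illustration with invented symptom data, not the actual hospital program:

```python
# A minimal sketch (hypothetical data, not the real diagnostic system)
# of the loop described above: match a patient's reported symptoms
# against a table of known diseases and rank the candidates.

KNOWLEDGE_BASE = {
    "strep throat": {"sore throat", "fever"},
    "influenza": {"fever", "cough", "fatigue"},
    "common cold": {"sore throat", "cough"},
}

def candidate_diagnoses(symptoms):
    """Return diseases ordered by how many reported symptoms they explain."""
    scored = [(len(signs & symptoms), name)
              for name, signs in KNOWLEDGE_BASE.items()
              if signs & symptoms]
    return [name for score, name in sorted(scored, reverse=True)]

print(candidate_diagnoses({"sore throat", "fever"}))
# -> ['strep throat', 'influenza', 'common cold']
```

A real system would, as the talk describes, go further: suggesting the next test that best narrows the list, and defending an inclusion or omission when the doctor disagrees.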
Now, the reason I was reading some of this to
you, as some of you may recognize, is that actually
I was not looking ahead 10 years, but I was
reading from a story in The New York Times
of last Sunday. This development has already
been put into operation in hospitals in New
Jersey. After 2 years of research, the program
has already been developed. This is no dream.
I have always been impressed with the
educational opportunities available to military
personnel and people in government. By comparison,
industry does not consciously train employees
for new and higher responsibilities, nor
does it refresh the skills of its leaders. Technological
advance, in part because of the computer,
is so rapid that skills are rapidly becoming
obsolete. I think in the future the criterion
is not going to be what level of college
education you have alone, but how recently you got
your degree. What I would like to suggest is
that the computer in large part can alleviate the
very problems that it is causing. Already exploratory
projects are showing that the computer
is turning out to be an ideal device for exercising,
instructing, and examining students at
all levels, from grammar school to secondary
education, in diverse subjects from the alphabet
to engineering and science. Because computer-aided
instruction is going to accelerate the learning
process, the time it saves will have to be used
in sharpening the intuition and improving
judgment. The use of the computer as an educational
tool does not have to stop at the classroom.
It can be transferred to later professional
life.
Last year Commander Shepard, one of our
first astronauts, said: “During a several-month
interplanetary voyage, crew members would lose
some of the skills they had developed in such
maneuvers as earth reentry. It should not be
difficult to plug a simple simulation device into
the on-board computer required for spacecraft
guidance and navigation.”
I myself am partly involved in interplanetary
flight, and this is precisely what we intend to
do, that is, supply an on-board computer to enable
the astronauts on a 2-year voyage to maintain
their skills.
The effect of the computer in education can
be very great. The military officer, like the
civilian executive, should be able to achieve a
level of technical competence much earlier in his
career. By combining practical experience
with ready access to instructional materials,
through his personal computer, that is, he will
be able to sustain a high level of proficiency
and will be ready to adapt to any new situation.
Now, after all this optimistic talk, I must
conclude with a cold shower. The computer is
the most powerful tool that man has yet devised,
because it is a tool of the mind, rather than
of the body. Yet at times, when one observes
human behavior, one despairs about the willingness
of man to use his mind. Man may use well
a tool that he has fashioned, but he may also
abuse it.

No computer can possibly change bad data
into good data. In computer parlance, we have
an expression for this: GIGO, Garbage In,
Garbage Out. But that is no reason to blame
the tool. The computer cannot possibly replace
man, but, by taking over more and more of his
mental chores, it can liberate his mind, his intellect,
to push out the frontiers of knowledge
and, more importantly, to reshape and modify
himself.
Thank you very much.
Discussion
QUESTION: Sir, you didn’t say very much
about the future of optical scanning or optical
reading. Will you discuss that, please?
MR. NEHAMA: Yes. This development has
been known for some time. We are talking of
information retrieval on a really gigantic scale,
for example, taking the Library of Congress and
putting it all into the computer. This would require
major advances in optical processing,
optical scanning, and optical reading.
Right now we have optical devices capable of
reading what is referred to as well-formed characters. Most of the credit cards in use today have characters which look a little bit stilted, but are so designed to avoid confusing the computer. We can expect improvements in these devices, but nothing really revolutionary.

One item I would like to mention is the ability of the computer, given a text and instructions, to directly print a text in a variety of fonts, in any style you like, with any margin and any format. I would say that in 10 years the bookstore as you know it today will not exist. Over 60 percent of the costs in book selling are involved in inventory, in storing, and in transporting books. The book of the future may look as it looks today, but they won't sell you a copy. Books will be shown only for browsing. You will be able to order a book printed in any kind of font that you like, on any kind of paper, and then come back in an hour or so and pick it up.

QUESTION: Computers are now used in politics to predict election results on the basis of a limited sample. What future do you see for the computer in national and international politics?

MR. NEHAMA: I spoke earlier about the computer's ability to construct models. Many people have speculated about warfare of the future being, not an exchange in the delivery of weapons, but in pitting one's model against the enemy's. We should be able some day to construct very accurately the model of, let's say, an enemy's economy, and, knowing from the model the structure of his economy, then we would know how to attack it. I can imagine, similarly, that in international politics, if we know accurately another nation's political, social, and economic strengths and weaknesses, we might be able to negotiate and bargain more effectively.

Recently I saw an article with a big headline: "LABOR UNION TURNS TO COMPUTER FOR FUTURE BARGAINING WITH COMPANIES." To the extent that there may be this kind of interplay in international affairs, I can see a role for the machine. But this is all very speculative.

QUESTION: Where do you think we will be in the future in the United States, as opposed to some other countries, in the computer field?

MR. NEHAMA: I can only extrapolate. There is every reason to believe that the exponential rate of increase in the last 15 years in this country is going to continue. By comparison the total machine population in the rest of the world is less than 5 percent of what it is in this country. Whether we will maintain our momentum in a qualitative sense, I can't say. Earlier in my remarks I intimated that we have abused computers a great deal. We have used them for the wrong purposes, due to ignorance both of their capabilities and their limitations, and we have not fully exploited all the machines which have been thrown onto the junk pile. However, I think that we have gained knowledge not only in the ability to design and construct machines, but also the ability to use machines, in a way that other countries are now only beginning to approach. So I would say that, 15 or 20 years from now, we will probably still be far ahead in the computer field.

QUESTION: Do you think we are making much progress in semantic analysis toward the day when we will communicate in macrolanguages?

MR. NEHAMA: I don't think we are. A great deal of work is being done, but mechanical translation today is a dismal failure. Remember that human language has taken millions of years to develop, and that semantic analysis involves the distinctive cultural concepts, psychology, and the mores of a people. These are the difficulties that we don't know yet how to solve.
QUESTION: You see computers coming into
our economy and into our personal life.
Aren't there dangers in becoming too dependent
on the machine? I'm thinking, for example,
of the power failure in the Northeast.
MR. NEHAMA: Well, I doubt whether we
are likely to become too dependent. Recently
we were asked to see whether we could supply
the Director of the Apollo Project with a computerized
system so that he could get the status
of the project at his fingertips. Well, in the
Apollo Project changes are occurring so fast
that, unless they could be immediately communicated
to the computer, it would be positively
dangerous to give a decision maker that
kind of tool, because it would lull him into a
false sense of security. He would punch a button
to ask how the spacecraft was coming, and
he would get completely stale information.
So what we suggested was a glorified telephone
directory. If the director wants to find
out how the spacecraft is doing, he would get
from the computer the name of the man in
charge of the spacecraft and his telephone number,
and the name of the man who may replace
him. He calls him up. This is also a self-correcting
system, because, if the man in charge
doesn’t know, then he will be fired.
In short, our reliance on the computer is still
far from total, and I do not think that it ever
will be.

The Supreme Court’s Decision on the Affordable Care Act

I was surprised to learn that Chief Justice Roberts joined with four others in a majority opinion holding that the act would pass constitutional muster.  Some of my previous messages were about my belief that the law would be declared unconstitutional on the grounds that it violated the free-speech clause of the First Amendment of the U.S. Constitution.  My bet for the precedent that would be used was United States v. United Foods, in which a section of the 1990 farm law known as the Mushroom Promotion, Research, and Consumer Information Act was declared unconstitutional for violating the First Amendment.

Remember the word "tax".  I believe that legislation adjusting provisions of the income tax laws gets more attention in Congress than any other type of law except for the annual appropriations laws.  I think that the tax provisions of the Affordable Care Act will be made so cumbersome and complicated that medical service providers will begin to opt out of providing service to minimize the need to include this in their income tax returns.

It may become like the Foreign Tax Credit provisions of IRS Form 1116 for taxes paid to foreign countries on dividends and capital gains of foreign stocks.  It costs me money to have my accountant do the complicated calculations to reduce my federal income tax by some small amount.  Another possibility is that medical service providers will reincorporate themselves in ways that allow them to pass on the costs of complying with federal laws and regulations to the patients and to their insurers.  Instead of declining to see patients whose medical insurance coverage is Medicare or Medicaid, medical service providers will find ways to discourage patients, because the additional income tax return burden on the patient will encourage them to seek help elsewhere.  Or it may cause insurers and self-insurers to say, "you have to pay extra if you go to X for service".  A bill for medical service will begin to look like the charges for having a bank account, or flying on an airplane.

I am sure that there are some ambitious people out there working on seminars of this type:   How to Increase Your Practice Revenue through the Affordable Care Act.