Technology has always been at the forefront of
human education. From the days of carving
figures on rock walls to today, when most
students are equipped with several portable
technological devices at any given time,
technology continues to push educational
capabilities to new levels. Looking at where
educational methods and tools have come from
and where they are headed,
technology’s importance in the classroom is
more evident now than ever.
A HISTORY OF CLASSROOM TECHNOLOGY
THE PRIMITIVE CLASSROOM
In the Colonial years, wooden paddles with
printed lessons, called Horn-Books, were used to
assist students in learning verses. Over 200 years
later, in 1870, technology advanced to include the
Magic Lantern, a primitive version of a slide
projector that projected images printed on glass
plates. By the time World War I ended, around
8,000 lantern slides were circulating through the
Chicago public school system. By the time the
chalkboard came around in 1890, followed by the
pencil in 1900, it was clear that students were
hungry for more advanced educational tools.
Radio in the 1920s sparked an entirely new
wave of learning; on-air classes began popping
up for any student within listening range.
Next came the overhead projector in 1930,
followed by the ballpoint pen in 1940 and
headphones in 1950.
Videotapes arrived on the scene in 1951,
creating a new and exciting method of
instruction.
The Skinner Teaching Machine produced a
combined system of teaching and testing,
providing reinforcement for correct answers so
that the student could move on to the next
lesson.
The photocopier (1959) and handheld
calculator (1972) entered the classrooms next,
allowing for mass production of material on the
fly and quick mathematical calculations.
The Scantron system of testing, introduced by
Michael Sokolski in 1972, allowed educators to
grade tests more quickly and efficiently.
The pre-computer years were formative in the
choices made for computers in the years
following. Immediate response-type systems
(video, calculator, Scantron) had become
necessary, and quick production of teaching
materials, using the photocopier, had become
standard. The U.S. Department of Education
reports that high school enrollment was only 10%
in 1900, but by 1992 had expanded to 95%. The
number of students in college in 1930 was around
1 million, but by 2012 had grown to a record 21.6
million. Teachers needed new methods of
instruction and testing, and students were looking
for new ways to communicate, study, and learn.
THE ENTRANCE AND SIGNIFICANCE OF
PERSONAL COMPUTERS
Although the first computers were developed in
the ‘30s, everyday-use computers were
introduced in the ‘80s. The first portable
computer, in 1981, weighed 24 pounds and cost $
1,795. When IBM introduced its first personal
computer in 1981, the educational world knew
that it was on the verge of greatness. Time
magazine named the computer its “Machine of the
Year” for 1982, and aptly so: the foundation of
immediate learning capabilities had been laid.
Time declared, “it is the end result of a
technological revolution that has been in the
making for four decades and is now, quite
literally, hitting home.”
Toshiba released its first mass-market
consumer laptop in 1985 (the T1100), and
Apple’s famous Mac (which later evolved into
the PowerBook) was available starting in 1984.
In 1990, the World Wide Web was given life
when British researcher Tim Berners-Lee developed
HyperText Markup Language, or HTML, and when
the National Science Foundation (NSF)
removed restrictions on the commercial use of
the Internet in 1993, the world exploded into a
frenzy of newfound research and
communication methods.
The first Personal Digital Assistants (PDAs)
were released by Apple Computer Inc. in 1993,
and with that, computers became part of everyday
life, if not every moment. By 2009, 97% of
classrooms had one or more computers, and 93%
of classroom computers had Internet access. For
every five students, there was one computer.
Instructors reported that 40% of students used
computers often in their lessons, in addition to
interactive whiteboards and digital cameras. College
students today are rarely without some form of
computer technology: 83% own a laptop, and over
50% have a smartphone.
THE FUTURE OF TECHNOLOGY IN THE
CLASSROOM
In the years since they were introduced, MySpace
(2003), Facebook (2004), and Twitter (2007)
have changed both the communication
and business worlds. Instant connectivity has
branched out from being merely a tool of personal
communication to a platform for educational
instruction and outreach. Social media is now
being recognized as an accepted form of
instruction in some instances, and groups such
as Scholastic Teachers provide excellent support
and tips for instructors. Many instructors use
social media to communicate directly with their
students, or to form forum-style groups for
students to communicate with each other, and
the method seems to be proving valuable in
providing one-on-one attention to students’
questions and concerns.
With the classroom having already evolved into a
hotbed of technological advances, what could the
future hold to advance educational
capabilities even further?
Biometrics, a technology that recognizes
people based on certain physical or behavioral
traits, is on the technological horizon. The
science could be used to recognize the physical
and emotional disposition of students in the
classroom, tailoring course material to
each individual’s needs based on biometric
signals.
A second up-and-coming technology is
Augmented Reality (AR) glasses, rumored to
be on Google’s release list, and this technology
could open up a whole new world for education. AR
Glasses (or even contact lenses) will layer data
on top of what we naturally see, to allow for a
real-world learning experience. For example, a
student wearing AR Glasses could potentially
sit at his desk and have a conversation with
Thomas Edison about invention. It was Edison,
after all, who said that “Books will soon be
obsolete in schools. Scholars will soon be
instructed through the eye.”
Multi-touch surfaces are already common in
devices such as the iPhone, but the
technology could become more relevant to
education through fully multi-touch
surfaces, such as desks or workstations. This
could allow students to collaborate with other
students, even those around the world, and
videos and other virtual tools could be
streamed directly to the surface.
EDUCATORS AND THE EVOLUTION OF
TECHNOLOGY IN THE CLASSROOM
With the evolution of technology, educational
capabilities are growing and changing every day.
The Internet is a vast electronic library of
information, and both research and instruction
can be accomplished with the click of a mouse.
With these advances come new responsibilities for
the instructor, which in turn increase the value of
a Master of Science in Education in Learning
Design and Technology. As technology advances, an educator’s abilities
will grow by leaps and bounds, and without the
knowledge of these changes and capabilities, an
instructor has a good chance of being left behind.
A career in education requires hard work and
dedication, but, for the diligent educator, it can
prove very rewarding. For those who are serious
about success in the education field, staying well-
informed of current and changing technologies is
imperative. As the world of technology evolves,
the learning environment, both on-campus and
online, will equally progress, and the need for
teachers who are educated in technology and
design will continue to grow.
Why bother at all to look back? And why did I
select these as the top three milestones in the
evolution of information technology?
Most observers of the IT industry prefer and are
expected to talk about what’s coming, not what’s
happened. But to make educated guesses about
the future of the IT industry, it helps to
understand its past. Here I depart from most
commentators who, if they talk at all about the
industry’s past, divide it into hardware-defined
“eras,” usually labeled “mainframes,” “PCs,”
“Internet,” and “Post-PC.”
Another way of looking at the evolution of IT is to
focus on the specific contributions of
technological inventions and advances to the
industry’s key growth driver: digitization and the
resulting growth in the amount of digital data
created, shared, and consumed.
The industry was born with the first giant
calculators digitally processing and manipulating
numbers and then expanded to digitize other,
mostly transaction-oriented activities, such as
airline reservations. But until the 1980s, all
computer-related activities revolved around
interactions between a person and a computer.
That did not change when the first PCs arrived on
the scene.
The PC was simply a mainframe on your desk. Of
course it unleashed a wonderful stream of
personal productivity applications that in turn
contributed greatly to the growth of enterprise
data and the start of digitizing leisure-related,
home-based activities. But I would argue that the
major quantitative and qualitative leap occurred
only when work PCs were connected to each
other via Local
Area Networks (LANs)—where Ethernet became
the standard—and then long-distance via Wide
Area Networks (WANs). With the PC, you could
digitally create the memo you previously typed on
a typewriter, but to distribute it, you still had to
print it and make paper copies. Computer
networks (and their “killer app,” email) made the
entire process digital, ensuring the proliferation of
the message, drastically increasing the amount of
data created, stored, moved, and consumed.
Connecting people in a vast and distributed
network of computers not only increased the
amount of data generated but also led to
numerous new ways of getting value out of it,
unleashing many new enterprise applications and
a new passion for “data mining.” This in turn
changed the nature of competition and gave rise
to new “horizontal” players, focused on one IT
component as opposed to the vertically
integrated, “end-to-end solution” business model
that had dominated the industry until then. Intel
in semiconductors, Microsoft in operating systems,
Oracle in databases, Cisco in
networking, Dell in PCs (or rather, build-to-order
PCs), and EMC in storage made the 1990s
the decade in which “best-of-breed” was what
many IT buyers believed in, assembling their IT
infrastructures from components sold by focused,
specialized IT vendors.
The next phase in the evolution of the industry,
the next quantitative and qualitative leap in the
amount of data generated and how we use
networked computers, came with the invention of
the World Wide Web (commonly mislabeled as
“the Internet”). It led to the proliferation of new
applications which were no longer limited to
enterprise-related activities but digitized almost
any activity in our lives. Most important, it
provided us with tools that greatly facilitated the
creation and sharing of information by anyone
with access to the Internet (the open and almost
free wide area network that only a few people cared or
knew about before the invention of the World Wide
Web). The work memo I had typed on a typewriter,
which became a digital document sent across the
enterprise and beyond, now became my life journal,
which I could discuss with others, including
people on the other side of the globe whom I had
never met. While computer networks took IT from the
accounting department to all corners of the
enterprise, the World Wide Web took IT to all
corners of the globe, connecting millions of
people. Interactive conversations and sharing of
information among these millions replaced and
augmented broadcasting and drastically increased
(again) the amount of data created, stored,
moved, and consumed. And just as in the
previous phase, a bunch of new players emerged,
all of them born on the Web, all of them regarding
“IT” not as a specific function responsible for
running the infrastructure but as the essence of
their business, with data and its analysis becoming
their competitive edge.
We are probably going to see soon—and maybe
already are experiencing—a new phase in the
evolution of IT and a new quantitative and
qualitative leap in the growth of data. The cloud
(a new way to deliver IT), big data (a new attitude
towards data and its potential value), and the
Internet of Things (connecting billions of
monitoring and measurement devices quantifying
everything) combine to sketch for us the future of
IT.