3.2 Discussion: Putting Ideas Together

Number of replies: 65
Share your "It Says, I Say, And So" responses on the discussion forum. What answers did your classmates have? How were their responses different or similar to yours? If you'd like, respond to your classmates' posts.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Lailaturahma Binar Risky Maharani -
Originally, computers were conceived only as calculating devices. The first digital electronic calculating machines were developed during World War II, and since then the power, speed, and versatility of computers have kept increasing, leading to the Digital Revolution. Nowadays, computers are used not only for calculating but also for many other activities, such as typing, drawing, and making projects. With the rapid growth of technology, computers can be found anywhere around us, and it is not exaggerated to say that computers are a basic need for those who want to move forward.
In reply to Lailaturahma Binar Risky Maharani

Re: 3.2 Discussion: Putting Ideas Together

by Sayed Fozele Uddin Rafi -
Computers, initially conceived as calculating devices and aided by simple manual tools like the abacus, have significantly evolved over time. Mechanical and electrical devices during the Industrial Revolution automated tasks, while sophisticated electronic calculating machines emerged during World War II. Semiconductor breakthroughs, including the introduction of transistors and MOSFET technologies, propelled the development of microprocessors and integrated circuits, marking the onset of the microcomputer revolution. This rapid progress, predicted by Moore's Law, led to exponential growth in computing power, enabling the Digital Revolution from the late 20th to early 21st centuries.

Overall, the trajectory of computers showcases a continuous journey marked by innovations in technology, particularly in semiconductors, driving exponential improvements in speed, power, and versatility, fundamentally transforming computing capabilities and revolutionizing the world.
In reply to Lailaturahma Binar Risky Maharani

Re: 3.2 Discussion: Putting Ideas Together

by Wilfredo Polo Tirado -
Information and Communication Technologies (ICT) have evolved constantly, and Artificial Intelligence (AI) is the trend nowadays, so computers have changed over the last century, too.
In reply to Lailaturahma Binar Risky Maharani

Re: 3.2 Discussion: Putting Ideas Together

by ABEL TESFAHUN KUMSA -
The evolution of computers began with their conception as calculating devices, with the first electronic machines appearing during World War II. Over time, their power, speed, and versatility increased, leading to the Digital Revolution. Nowadays, computers serve not only for calculations but also for a wide range of activities like typing, drawing, and project-making. The ubiquitous presence of computers in our lives has made them essential tools for modern living.
Reflecting on this progression, it is fascinating to witness how computers have transcended their original purpose. They have become indispensable in our daily lives, enabling us to communicate, create, innovate, and connect in ways that were once unimaginable. As technology continues to advance, the role of computers will likely expand further, shaping the way we work, learn, and interact with the world around us.
Looking ahead, the integration of computers into every aspect of modern life underscores their significance as a fundamental necessity for individuals progressing in today's society. As we embrace the evolving capabilities of technology, we are poised to witness even more profound changes in how computers influence our lives, driving innovation, efficiency, and connectivity to new heights.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Wilfredo Polo Tirado -
The author said there are different types of devices: manual, mechanical, and digital, and that computers were used mostly for calculations. He also said that computing has gone through different stages, from analog to digital. Computers have evolved from manual devices to microcomputers with microprocessors. People can choose a computer not only by its size but also by its features and price.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Ambo Upe -

INFORMATION LITERACY

An article says that information literacy is the ability to find, evaluate, organize, use, and communicate information in all its various formats, most notably in situations requiring decision making, problem solving, or the acquisition of knowledge.

I think the ability to find information and the ability to use it are the two most important parts of information literacy. I myself make every endeavor to find the information I need to solve problems in my life.

So there are two main elements of information literacy: first, the ability to find and use information; second, the function of the information itself, which is to solve problems in life. That is all.

In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Temesgen mr. -
Early computers were calculating devices; then, early in the Industrial Revolution, some mechanical devices were built, followed by more sophisticated electrical machines with greater speed, power, and versatility. The advancement of computers has gone through many stages, and with each stage their speed, power, and versatility have improved; notably, more sophisticated electrical machines emerged in the early 20th century.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Ataklti.zereu@mu.edu.et Neizgi -
The three important things I read from the passage about computers are: first, a device like the abacus was used to help people with calculations; second, mechanical computers were also used to help people with looms; third, computers like contemporary ones were used for the first time during the Second World War. From that time onward, their advancement has continued unpredictably. I also say that computers will take over most human services soon.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Elizabeth Escaño -
Computers were initially made to serve as computing devices. The first digital electronic calculating machines were developed during World War II. Computers have become ever more powerful as the years have gone by.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Mouhamed Jalil EWOLO NKEN -

Computers are devices that were first made to perform calculations, and over the years they have developed. They became more efficient at accomplishing tedious and specialized tasks in an automated way. Their hardware has also changed and developed through the years and centuries.

In reply to Mouhamed Jalil EWOLO NKEN

Re: 3.2 Discussion: Putting Ideas Together

by DANIEL OJONYE -
Early computers were used as calculating devices; then, during the Industrial Revolution, some mechanical devices were built, followed by more sophisticated electrical machines with greater speed, power, and versatility. The advancement of computers has gone through revolutions that kept improving speed, power, and versatility, which is why more sophisticated electrical machines were produced in the early 20th century.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Junlord Ray Rosete -
Three important things about computers: they served as calculating devices, they were used to automate long, tedious tasks such as guiding patterns for looms, and they were used in all stages of the Industrial Revolution.

Three things I already knew about computers: they perform calculations, the first digital electronic calculating machines were developed during World War II, and since ancient times simple manual devices like the abacus have aided people in doing calculations.

It says that early computers were conceived only as calculating devices, and that since ancient times simple manual devices like the abacus have aided people in doing calculations. I say that computers have been very helpful machines in both the old analog era and the modern digital era. And so, the evolution and history of the computer, from the early Industrial Revolution up to the Digital Revolution, still plays an important role in our generation and society.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by LOKMANE BENBOURAS -

Computers, integral to modern life, are electronic devices that process data to perform various tasks. They consist of hardware components such as the central processing unit (CPU), memory, and storage devices. These components work in tandem to execute instructions and store information. The evolution of computers has been marked by a relentless pursuit of smaller size, increased speed, and enhanced capabilities. From early room-sized mainframes to today's sleek laptops and powerful desktops, computers have become ubiquitous tools for communication, work, entertainment, and problem-solving. Their impact spans across industries, shaping the way we live, work, and connect in the digital age.

In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by MAHAMMADU HANIPA AFNAN -
Information and Communication Technologies (ICT) have evolved constantly, and Artificial Intelligence (AI) is the trend nowadays, so computers have changed over the last century, too.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Maya Batkunda -
Early on, computers were used as calculating devices. As time went by, they developed further: thanks to the technology revolution, computers have expanded in terms of speed, power, and versatility. In this digitalization era, the computer has many functions. It can be used to type, work on data, paint, draw, play games or music, watch videos, and do much more. So the computer has become a device with multiple tasks and programs designed to help humans do their work.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Zyd Genes Albarico -
Computers have developed as the years have gone by.
In reply to Zyd Genes Albarico

Re: 3.2 Discussion: Putting Ideas Together

by Chala Usmael -

It says: during ancient times, computers were used as calculating machines only.

I say: at that time, most people could not afford one because of its high cost.

And so, over time, it became an affordable and versatile machine.

In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Bunga Ayu Januarita -
Early computers were only conceived as calculating devices. Nowadays, we can use computers to do many other things like playing games and designing.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Negede Kelemework -
Computers, initially conceived as calculating devices and aided by simple manual tools like the abacus, have significantly evolved over time. Mechanical and electrical devices during the Industrial Revolution automated tasks, while sophisticated electronic calculating machines emerged during World War II. Semiconductor breakthroughs, including the introduction of transistors and MOSFET technologies, propelled the development of microprocessors and integrated circuits, marking the onset of the microcomputer revolution. This rapid progress, predicted by Moore's Law, led to exponential growth in computing power, enabling the Digital Revolution from the late 20th to early 21st centuries.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Shrestha Ashok -
Computers were used as calculating devices in the early days. Later on, their design, power, and performance were upgraded over time. From the abacus to the personal computer of today, computing has seen many changes. We are living in the era of artificial intelligence, and all of this has been made possible by computers.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Kashish Vohra -
Originally, computers were conceived only as calculating devices. The first digital electronic calculating machines were developed during World War II, and since then the power, speed, and versatility of computers have kept increasing, leading to the Digital Revolution. Nowadays, computers are used not only for calculating but also for many other activities, such as typing, drawing, and making projects. With the rapid growth of technology, computers can be found anywhere around us, and it is not exaggerated to say that computers are a basic need for those who want to move forward.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Raj Kumar Ghale -
The author said there are different types of devices: manual, mechanical, and digital, and that computers were used mostly for calculations. He also said that computing has gone through different stages, from analog to digital. Computers have evolved from manual devices to microcomputers with microprocessors. People can choose a computer not only by its size but also by its features and price.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by tharaphi bo -
Computers were at first conceived as calculating machines in ancient times.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Decoyna, Juanita Kane Sangao. -
It Says:
- Computers evolved from simple calculating devices aided by tools like the abacus.
- Mechanical and electrical devices during the Industrial Revolution automated tasks.
- Electronic calculating machines emerged during World War II.
- Semiconductor breakthroughs, including transistors and MOSFET technologies, led to microprocessors and integrated circuits.
- Moore's Law predicted exponential growth in computing power.
- The Digital Revolution occurred from the late 20th to early 21st centuries.

I Say:
The evolution of computers has been driven by advancements in technology, particularly in semiconductors. These advancements have led to significant improvements in computing power, speed, and versatility, fundamentally transforming the capabilities of computers and revolutionizing various aspects of our world.

And So:
Computers have become indispensable tools in modern society, impacting industries, communication, and innovation. The continuous advancement in technology will likely lead to further transformative changes in the future.

Regarding my classmates' responses, I observed some similarities and differences. Many of them highlighted the importance of semiconductor technology and Moore's Law in driving the evolution of computers, which aligns with my response. However, some classmates emphasized different aspects, such as the role of software development or the impact of computers on specific industries. It was interesting to see the diverse perspectives on the topic.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Salwa Haboubi -
Since man discovered the importance of computers, their development has never stopped. Moreover, with artificial intelligence, computers have become even more useful, and many fields, such as academic research and scientific reporting, depend on them.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Salwa Haboubi -
Computers have been in a non-stop process of development since man discovered that they are very useful and important in every field.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Maw Byar Myar -
Computers are really useful for all people, but some people can't access them because they are very pricey.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Orange Mw -
Computers have come a long way, evolving from basic calculating devices to powerful digital machines. This evolution was made possible by advancements in technology, especially in electronics and semiconductors. As we know, the rapid growth of computing power played a significant role. As a result, computers have become essential in modern life, revolutionizing how we work, communicate, and live.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Omar Sayed -
"It Says, I Say, And So" response:

It Says: A recent study suggests that rising technology access correlates with a decrease in reading comprehension among students.
I Say: While the study highlights a potential correlation, it's important to consider other factors that might be influencing reading comprehension, such as changes in educational methods or the types of content students are reading.
And So: Further research is needed to determine if there's a causal relationship between technology and reading comprehension. We should also explore ways to leverage technology to enhance, not hinder, reading skills.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Rania Afzal -
It says that computers are built on semiconductor transistors, which give them great speed, power, and versatility.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Nooria Nizrabi -

I knew that the computer was known as a calculating machine, and that the word "computer" comes from "compute," which means to calculate. I learned the history of the computer from the article. Early on, the computer was used as a calculating machine, but the first electronic machines were developed during the Second World War, microcomputers were developed later, and in the 1980s computer processors became more powerful. Now people use computers for writing, searching for data on Google, and calculation, but with modern features.

In reply to First post

This forum post has been removed

The content of this forum post has been removed and can no longer be accessed.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Hussain Altrshan -
In the beginning, computers were primarily seen as tools for performing calculations. The initial development of calculating devices occurred during World War II. Subsequently, there has been a significant enhancement in the power, speed, and adaptability of computers, catalyzing the Digital Revolution. Today, computers serve not only for calculations but also for a multitude of other tasks like typing, drawing, and project creation. With the rapid advancement of technology, computers have become omnipresent, and it is fair to assert that they are essential tools for individuals advancing in the modern world.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Lujin Mukhana -
Computers were supposed to be only calculating devices. The first calculating machines were developed during World War II; since then, the power and speed of computers have kept increasing, leading to the Digital Revolution. But now, computers are not only for calculating but also for working, drawing, and making projects. With the rapid growth of technology, computers can be found anywhere around us, and it is not exaggerated to say that computers are a basic need for those who want to move forward.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Alex Merkulieva -

Early computers were initially conceived as calculating devices. The production of integrated circuit chip technology and MOS transistors in the late 1950s led to the development of microprocessors and microcomputers in the 1970s. This technological advancement caused a revolution that has been increasing the speed, power, and versatility of computers ever since, leading to the Digital Revolution. Nowadays, computers are one of the main devices that empower every field of knowledge, including valuable scientific research.

In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Sri Dewi Sukma Ayu -
The text discusses the historical evolution of computers, from their origins as calculating devices to their role in the Digital Revolution. Computers have indeed undergone significant transformations throughout history, evolving from simple manual devices like the abacus to sophisticated digital machines. This evolution has been driven by key technological advancements, such as the development of digital electronic calculating machines during World War II and the subsequent rise of semiconductor technology. These advancements have led to the Digital Revolution, characterized by exponential growth in computing power and the widespread integration of computers into various aspects of modern life. Understanding the historical context and technological developments behind computers helps to appreciate their profound impact on society and the continuous innovation driving their evolution. As we move forward, it's crucial to recognize the role of computers in shaping our world and to anticipate further advancements that will shape the future of technology and human civilization.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Khinemar Myint -

The author mentioned how computers became important in our daily life and described different kinds of computers.

Furthermore, it is said that computers become more sophisticated day by day.

In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Nurane Seyidzade -

I don't have access to external discussion forums or classmates' responses. However, I can help you generate "It Says, I Say, And So" responses based on a given text or topic if you provide the necessary information. If you have specific questions or need assistance with a particular task related to generating responses or analyzing a text, feel free to ask!

In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Nadeem Sumair -
We talked about how computers have changed from just doing math to now being used for lots of things like typing and drawing. They've gotten better with new technology like transistors and Moore's Law.

I agree that computers have come a long way. It's amazing how they're now such a big part of our lives, helping us with so many tasks.

Looking ahead, it's clear that computers will keep playing a big role in our lives. With technology always improving, we can expect even more changes in the future.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Pape Charles NDIAYE -
It says: initially, computers were used only for calculating.
I say: computers have evolved and help human beings do many things, like data analysis, AI, and sophisticated calculations.
And so: the computer is nowadays a very useful device that helps us in our jobs.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Yuzi Melia Adi Putri -
I say computers were initially conceived as calculating devices, with the earliest tools like the abacus aiding in manual calculations.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Dehendji Alaeddine -
The computer was at first a counting device. Advancements in technology enabled ever more complex computers by the 20th century, and computers became larger and more powerful.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Karina Mariscal -
Computers were conceived only as calculating devices.
Nowadays computers are faster and easier to use.
Computers have evolved significantly.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Khairullah Rezaie -
The very early computers were meant to be merely calculating devices. Throughout the long journey from their emergence to the present, computers have evolved dramatically. The first sophisticated calculating devices started to appear during the Second World War. Since then, thanks to human innovation, their speed, power, and versatility have been consistently upgraded, resulting in the modern digital age. In the early days, storing 256 megabytes of information required a huge device that could barely fit in an empty room; however, new microchips can store hundreds of gigabytes of information in a tiny piece of material. In the modern digital age, even the thought of life without computers is a pain in the neck. Nowadays, you can rarely think of an aspect of life that is not impacted by computers; it is a comprehensive takeover by computers, indeed. Hence, it is not far-fetched that they will become alternatives to the human workforce.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Khairullah Rezaie -
The very early computers were meant to be simple calculating devices that assisted individuals with basic counting. More sophisticated computers emerged by the time of the Second World War. Since then, they have been evolving dramatically. Modern computers are far faster, stronger, and more versatile than they ever were. In the early days, storing 256 megabytes of information required a device the size of a room, but the evolution of technology has now made it possible to store that much information in a bean-sized chip. The takeover by computers is beyond debate; in the contemporary age, it is far beyond imagination to think of a world without computers. At present, computers are capable of accomplishing almost any task. Hence, it is not out of reach for computers to completely substitute for the human workforce.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Joselito Jimenez -
The passage emphasizes how the Industrial Revolution evolved. I can say that we can scrutinize and make inferences about how these things started and improved as the years went by. Over the decades that have passed, many inventors have been trying to make innovations and develop new technologies that can help with our daily tasks and make our lives easier.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Meraj Panahi -
Computers are something we encounter on a daily basis. Nowadays we do everything with computers and other digital devices. The development of computers during the 20th and 21st centuries was so rapid because companies needed to get everything done in a small amount of time. On the other hand, people needed devices they could carry around, which led the way to microcomputers and other digital devices.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Deleted user -
Computers, originally designed for mathematical calculations and supported by basic manual instruments like the abacus, have undergone substantial transformations throughout history. During the Industrial Revolution, mechanical and electrical devices were created to automate tasks, while more advanced electronic calculating machines emerged during World War II. Breakthroughs in semiconductor technology, such as the invention of transistors and MOSFETs, accelerated the development of microprocessors and integrated circuits, which marked the beginning of the microcomputer era. This rapid advancement, as foreseen by Moore's Law, resulted in a remarkable increase in computing capabilities, facilitating the Digital Revolution from the latter part of the 20th century to the early years of the 21st century.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Ilaria Auditore -
PCs' software and hardware alike have profoundly changed from their creation in the 20th century until today. They have been upgraded more and more, leading to the portable laptops we know nowadays. The digital revolution, though, is still happening, and it will continue to advance with better technology each day.

In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Aurum Ko -
It says that computers were only used as calculators in ancient times. However, their speed, power, and versatility have increased dramatically since the microcomputer revolution. Computers became more useful during the Digital Revolution. Nowadays, computers are used in many fields of modern life, such as education, health care, and industry.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Alessia De Martino -
At the beginning, computers were simple machines conceived only as calculating devices. The first digital electronic calculating machines, created during World War II, were very large and composed of relatively simple components. The technologies used to produce computers improved dramatically with the so-called Digital Revolution, which took place from the late 20th to the early 21st century. Built-in screens, impressive graphics, and additional components were created and adopted across the computers produced. After the 1980s, they started to be used not only for work purposes but also for personal productivity, programming, and games.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Goitom Desaley -
Effectively putting ideas together involves ensuring coherence, organization, clarity, and audience consideration. This includes using transitions, signposts, and precise language, as well as revising and editing for improvement.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Marissa Gobbo -
Early computers were conceived only as calculating devices; early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks. The first digital electronic calculating machines were developed during World War II. Today a computer is an electronic device that manipulates information, or data; it has the ability to store, retrieve, and process data. Computers were updated greatly during the late 20th and early 21st centuries.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Andrea Ambrosino -
Thanks to technological and scientific advancements, computers have increased in performance and been optimised in size over the last century.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Esha Munir -
Computers, initially conceived as calculating devices and aided by simple manual tools like the abacus, have significantly evolved over time. Mechanical and electrical devices during the Industrial Revolution automated tasks, while sophisticated electronic calculating machines emerged during World War II. Semiconductor breakthroughs, including the introduction of transistors and MOSFET technologies, propelled the development of microprocessors and integrated circuits, marking the onset of the microcomputer revolution. This rapid progress, predicted by Moore's Law, led to exponential growth in computing power, enabling the Digital Revolution from the late 20th to early 21st centuries.

Overall, the trajectory of computers showcases a continuous journey marked by innovations in technology, particularly in semiconductors, driving exponential improvements in speed, power, and versatility, fundamentally transforming computing capabilities and revolutionizing the world.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Shiela Maree Pino -
**It Says:**
The reading discusses the evolution of computers, highlighting their progression from simple calculating devices to sophisticated machines capable of performing various tasks. It mentions key technological advancements, such as the invention of transistors and integrated circuits, which have played a crucial role in shaping the development of computers. Additionally, the reading emphasizes the profound impact of computers on society, revolutionizing industries, communication, and daily life.

**I Say:**
I found the discussion about the evolution of computers particularly fascinating. It's incredible to think about how far computers have come since their inception, and the role that technological advancements have played in driving this progress. Additionally, the reading reaffirmed my understanding of the significant impact that computers have had on various aspects of modern life. From transforming industries to facilitating communication and enhancing productivity, computers have truly revolutionized the way we live and work.

**And So:**
Reflecting on the insights from the reading and my own knowledge, it's clear that computers will continue to play a central role in shaping the future. As technology continues to advance, we can expect computers to become even more powerful, versatile, and integrated into our daily lives. It's exciting to think about the possibilities that lie ahead as we harness the potential of computing technology to address complex challenges and drive further innovation.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Maksym Bondarenko -
It Says:
Computers are not to blame. They offer a wide range of benefits, from entertainment and knowledge to inspiration. They have revolutionized how we work and interact, making tasks easier and more efficient.
I Say:
Positive Aspects:
Computers have enabled incredible achievements in various fields, from scientific breakthroughs to artistic creations.
They empower us by providing access to information and resources that were once inaccessible.
They serve as tools for creativity and innovation, sparking new ideas and possibilities.
Computers have connected people across the globe, fostering communication and collaboration on a scale never seen before.
Negative Aspects:
Despite their benefits, computers also pose risks, such as cybersecurity threats and privacy concerns.
They can contribute to social isolation and addiction, as people spend increasing amounts of time in front of screens.
Automation driven by computers has led to job displacement and economic inequality in some sectors.
There's also the risk of overreliance on technology, which can hinder critical thinking and problem-solving skills.
And So:
While computers have their drawbacks, there's hope that we can harness their potential for good.
As technology continues to advance, there's the possibility of creating more secure and ethical computing systems.
Education and awareness are key in ensuring that people use computers responsibly and ethically.
Ultimately, it's up to us to ensure that computers are used as tools for progress and empowerment rather than instruments of harm.
In conclusion, computers are neither inherently good nor bad; it's how we choose to use them that determines their impact on society. By recognizing their potential and addressing their challenges, we can navigate the digital age with wisdom and foresight.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Lara Della Torre -
It Says: Early computers were mechanical devices used for tasks like guiding patterns for looms.
I Say: The transition to digital electronic calculating machines during World War II laid the groundwork for modern computing.
And So: The continuous advancement of technology has led to the exponential growth of computing power and capabilities.
My classmates had similar responses, emphasizing the evolution of computers from simple tools to complex digital machines. Some focused on specific milestones, like the invention of the microprocessor, while others discussed the broader impact of computing on society. Overall, our responses were aligned in recognizing the transformative nature of technological advancements in the field of computing.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Gebreamlak Desta -
It Says: Early computers were initially seen as calculating devices, with simple manual tools like the abacus and mechanical devices aiding in calculations since ancient times.
I Say: The progression from analog to digital computing marked a significant shift in the capabilities and speed of computers.
And So: The continuous innovation in computer technology, driven by Moore's law and advancements in semiconductor technology, has transformed the way we live and work.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by momar fall -
Originally, computers were conceived only as calculating devices. The first calculating machines were developed during World War II, and since then the power, speed, and versatility of computers have kept increasing, leading to the Digital Revolution. Nowadays, computers are used not only for calculating but also for creating applications and automating tasks. With the rapid growth of technology, computers can be found anywhere around the world.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Abegail Bishop -
It says that computers were mainly used as calculators back in the day. I say that computers are now very modern and can do almost anything. And so, they make our lives easier.
In reply to First post

Re: 3.2 Discussion: Putting Ideas Together

by Maylon Enrique Bovea Toscano -
The History of Computers in a Nutshell

Computers have wedged themselves into every facet of our lives—they are what we would use as the symbolic representation of the modern world.

But did you know that the history of computers dates back to the 1800s?

Indeed, the history and evolution of computers is quite extraordinary—and with many early computing technology innovations tied to defense contracts, much of this information was kept secret from the public for decades. In this article, we explore the development and progression of computers.

Mid-1800s-1930s: Early Mechanical Computers
The first computers were designed by Charles Babbage in the mid-1800s and are sometimes collectively known as the Babbage Engines. These include the Difference Engine No. 1, the Analytical Engine, and the Difference Engine No. 2.

The Difference Engine was constructed from designs by Charles Babbage. (Photo by Allan J. Cronin)

These early computers were never completed during Babbage’s lifetime, but their complete designs were preserved. Eventually, one was built in 2002.

While these early mechanical computers bore little resemblance to the computers in use today, they paved the way for a number of technologies that are used by modern computers, or were instrumental in their development. These concepts include the idea of separating storage from processing, the logical structure of computers, and the way that data and instructions are input and output.

Other important mechanical computers are the Automatic Electrical Tabulating Machine—which was used in the U.S. Census of 1890 to handle data from more than 62 million Americans—and the first binary computer: Konrad Zuse’s Z1, which was developed in 1938 and was the precursor to the first electro-mechanical computer.

1930s: Electro-Mechanical Computers
Electro-mechanical computers generally worked with relays and/or vacuum tubes, which could be used as switches.

Some electro-mechanical computers—such as the Differential Analyzer built in 1930—used purely mechanical internals but employed electric motors to power them.

These early electro-mechanical computers were either analog or digital—such as the Model K and the Complex Number Calculator, both produced by George Stibitz.

Stibitz, by the way, was also responsible for the first remote access computing, done at a conference at Dartmouth College in New Hampshire. He took a teleprinter to the conference, leaving his computer in New York City, and then proceeded to take problems posed by the audience. He entered the problems on the keypad of his teleprinter, which outputted the answers afterward.

It was during the development of these early electro-mechanical computers that many of the technologies and concepts still used today were first developed. The Z3, a descendant of the Z1 developed by Konrad Zuse, was one such pioneering computer. The Z3 used floating-point numbers in computations and was the first program-controlled digital computer.

Other electro-mechanical computers included Bombes, which were used during WWII to decrypt German codes.

1940s: Electronic Computers
The first electronic computers were developed during World War II, with the earliest of those being the Colossus—whose name was fitting for its size. The Colossus was developed to decrypt secret German codes during the war. It used vacuum tubes and paper tape and could perform a number of Boolean (e.g. true/false, yes/no) logical operations.

Another notable early electronic computer was nicknamed “The Baby” (officially known as the Manchester Small-Scale Experimental Machine). While the computer itself wasn’t otherwise remarkable, it was the first computer to use the Williams Tube, a type of random access memory (RAM) that used a cathode-ray tube.

Some early electronic computers used decimal numeric systems (such as the ENIAC and the Harvard Mark 1), while others—like the Atanasoff-Berry Computer and the Colossus Mark 2—used binary systems.

With the exception of the Atanasoff-Berry Computer, all the major models were programmable, either using punch cards, patch cables and switches, or through stored programs in memory.

1950s: The First Commercial Computers
The first commercially available computers came in the 1950s. While computing up until this time had mainly focused on scientific, mathematical, and defense capabilities, new computers were designed for business functions, such as banking and accounting.

The J. Lyons Company, which was a British catering firm, invested heavily in some of these early computers. In 1951, LEO (Lyons Electronic Office) became the first computer to run a regular routine office job. By November of that year, they were using the LEO to run a weekly bakery valuations job.

The UNIVAC was the first commercial computer developed in the U.S., with its first unit delivered to the U.S. Census Bureau. It was the first mass-produced computer, with more than 45 units eventually produced and sold.

The IBM 701 was another notable development in early commercial computing; it was the first mainframe computer produced by IBM. It was around the same time that the Fortran programming language was being developed (for the 704).

A smaller IBM 650 was developed in the mid-1950s and was popular due to its smaller size and footprint (it still weighed over 900kg, with a separate 1350kg power supply). It cost the equivalent of almost $4 million today (adjusted for inflation).

Mid-1950s: Transistor Computers
The development of transistors led to the replacement of vacuum tubes and resulted in significantly smaller computers. In the beginning, they were less reliable than the vacuum tubes they replaced, but they also consumed significantly less power.

These transistors also led to developments in computer peripherals. The first disk drive, the IBM 350 RAMAC, was introduced in 1956. Remote terminals also became more common with these second-generation computers.

1960s: The Microchip and the Microprocessor
The microchip (or integrated circuit) is one of the most important advances in computing technology. Many overlaps in history existed between microchip-based computers and transistor-based computers throughout the 1960s, and even into the early 1970s.

Microchips allowed the manufacturing of smaller computers. Photo by Ioan Sameli

The microchip spurred the production of minicomputers and microcomputers, which were small and inexpensive enough for small businesses and even individuals to own.

The microchip also led to the microprocessor, another breakthrough technology that was important in the development of the personal computer.

Three microprocessor designs came out at about the same time. The first was produced by Intel (the 4004). Soon after, models from Texas Instruments (the TMS 1000) and Garrett AiResearch (the Central Air Data Computer, or CADC) followed.

The first processors were 4-bit, but 8-bit models followed quickly, by 1972.

16-bit models were produced in 1973, and 32-bit models soon followed.

In 1980, AT&T Bell Labs created the first fully 32-bit single-chip microprocessor, with 32-bit buses, 32-bit data paths, and 32-bit addresses.

The first 64-bit microprocessors were in use in the early 1990s in some markets, though they didn’t appear in the PC market until the early 2000s.
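To put these word sizes in perspective, here is a back-of-the-envelope sketch in Python; the bit widths are generic examples rather than the specifications of any particular chip named above. The memory a processor can address grows as 2 raised to the number of address bits:

# Rough illustration: addressable memory grows as 2**address_bits bytes.
# The widths below are generic examples, not specs of any chip named in the article.
def addressable_bytes(address_bits: int) -> int:
    return 2 ** address_bits

for bits in (16, 32, 64):
    print(f"{bits}-bit addresses can reach {addressable_bytes(bits):,} bytes")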

1970s: Personal Computers
The first personal computers were built in the early 1970s. Most of these were limited-production runs and were based on small-scale integrated circuits and multi-chip CPUs.

The Commodore PET was a personal computer in the 70s.

Photo by Tomislav Medak

The Altair 8800 was the first popular computer using a single-chip microprocessor. It was also sold in kit form to electronics hobbyists, meaning purchasers had to assemble their own computers.

Clones of this machine quickly cropped up, and soon there was an entire market based on the design and architecture of the 8800. It also spawned a club based around hobbyist computer builders, the Homebrew Computer Club.

1977 saw the rise of the “Trinity” (based on a reference in Byte magazine): the Commodore PET, the Apple II, and the Tandy Corporation’s TRS-80.

These three computer models eventually went on to sell millions of units. These early PCs had between 4 KB and 48 KB of RAM. The Apple II was the only one with a full-color, graphics-capable display, and it eventually became the best-seller of the trinity, with more than 4 million units sold.

1980s-1990s: The Early Notebooks and Laptops
One particularly notable development in the 1980s was the advent of the commercially available portable computer.

The Osborne 1 was small enough to transport. Photo by Tomislav Medak

The first of these was the Osborne 1, in 1981. It had a tiny 5″ monitor and was large and heavy compared to modern laptops (weighing in at 23.5 pounds). Portable computers continued to develop, though, and eventually became streamlined and easily portable, as the notebooks we have today are.

These early portable computers were portable only in the most technical sense of the word. Generally, they were anywhere from the size of a large electric typewriter to the size of a suitcase.

The Gavilan SC was the first PC to be sold as a “laptop”.

The first laptop with a flip form factor was produced in 1982, but the first portable computer actually marketed as a “laptop” was the Gavilan SC in 1983.

Early models had monochrome displays, though there were color displays available starting in 1984 (the Commodore SX-64).

Laptops grew in popularity as they became smaller and lighter. By 1988, displays had reached VGA resolution, and by 1993 they had 256-color screens. From there, resolutions and colors progressed quickly.
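To give a rough sense of those display milestones, here is a back-of-the-envelope sketch; the figures are standard VGA numbers, not values taken from the article. A 256-color screen corresponds to 8 bits per pixel, and one frame's worth of memory is simply width x height x bytes per pixel:

# Back-of-the-envelope: colors and framebuffer size for an 8-bit, VGA-resolution display.
bits_per_pixel = 8
colors = 2 ** bits_per_pixel              # 2**8 = 256 distinct colors
width, height = 640, 480                  # standard VGA resolution
framebuffer_bytes = width * height * bits_per_pixel // 8
print(f"{colors} colors, framebuffer of {framebuffer_bytes:,} bytes (~{framebuffer_bytes // 1024} KB)")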

Other hardware features added during the 1990s and early 2000s included high-capacity hard drives and optical drives.

Laptops typically come in three categories, as shown by these MacBooks. Photo by Benjamin Nagel

Laptops are generally broken down into three different categories:

Desktop replacements
Standard notebooks
Subnotebooks
Desktop replacements are usually larger, with 15-17″ displays and performance comparable to that of some higher-end desktop computers.

Standard notebooks usually have displays of 13-15″ and are a good compromise between performance and portability.

Subnotebooks, including netbooks, have displays smaller than 13″ and fewer features than standard notebooks.

2000s: The Rise of Mobile Computing
Mobile computing is one of the most recent major milestones in the history of computers.

Many smartphones today have higher processor speeds and more memory than desktop PCs had even ten years ago. With phones like the iPhone and the Motorola Droid, it’s becoming possible to perform most of the functions once reserved for desktop PCs from anywhere.

The Droid is a smartphone capable of basic computing tasks such as emailing and web browsing.

Mobile computing really got its start in the 1980s, with the pocket PCs of the era.

These were something like a cross between a calculator, a small home computer, and a PDA. They largely fell out of favor by the 1990s, when PDAs (Personal Digital Assistants) became popular.

A number of manufacturers had models, including Apple and Palm.

The main feature that set PDAs apart from most pocket PCs was a touchscreen interface. PDAs are still manufactured and used today, though they’ve largely been replaced by smartphones.

Smartphones have truly revolutionized mobile computing. Most basic computing functions can now be done on a smartphone, such as email, browsing the internet, and uploading photos and videos.

Late 2000s: Netbooks
Another recent progression in computing history is the development of netbook computers.

Netbooks are smaller and more portable than standard laptops, while still being capable of performing most functions average computer users need (using the Internet, managing email, and using basic office programs). Some netbooks include not only built-in WiFi but also built-in mobile broadband connectivity.

The Asus Eee PC 700 was the first netbook to enter mass production.

The first mass-produced netbook was the Asus Eee PC 700, released in 2007. It launched first in Asia and arrived in the US not long afterward.

Other manufacturers quickly followed suit, releasing additional models throughout 2008 and 2009.

One of the main advantages of netbooks is their lower cost (generally ranging from around US$200-$600). Some mobile broadband providers have even offered netbooks for free with an extended service contract. Comcast also had a promotion in 2009 that offered a free netbook when you signed up for their cable internet services.

Most netbooks now come with Windows or Linux installed, and soon, there will be Android-based netbooks available from Asus and other manufacturers.

The history of computing spans nearly two centuries at this point, much longer than most people realize.

From the mechanical computers of the 1800s to the room-sized mainframes of the mid-20th century, all the way up to the netbooks and smartphones of today, computers have evolved radically throughout their history.

The past 100 years have brought technological leaps and bounds to computing, and there’s no telling what the next 100 years might bring.