Science and Technology for Today's World

Read this text on the effects on human society of advanced computer and communications technologies, such as the internet and social media, and on the advances in medical science that have eradicated diseases that plagued the world for centuries.

World War II brought about a massive technological transformation as countries like Germany and the United States rapidly innovated to avoid destruction and defeat their enemies. In the decades after the war, there was major progress in medical technology, the creation of new vaccines, and the elimination of deadly diseases. All these achievements had profound effects on the way people lived, traveled, and worked. Underlying them were major advancements in the field of information technologies, such as computers. Once the war was over, the development of increasingly powerful computers ushered in a computer revolution as powerful as the 19th century's Industrial Revolution and, with it, a digital age.


The Digital Computer Revolution

Many of the technological advancements of the 1940s and 1950s came in the form of increasingly powerful analog computers, which analyze a continuous stream of information, much like that recorded on vinyl records. Analog computers worked well for solving big mathematical problems, such as the calculations related to electrical power delivery systems or the study of nuclear physics. However, one of their weaknesses was that they were inefficient at managing large amounts of data. Digital computers, or those that translate information into a complex series of ones and zeros, were far more capable of managing bulk data. Just a few years after the war, digital computing received a huge boost with the invention of the transistor, a device with far more computing potential than its predecessor, the vacuum tube. Scientists could amplify this enlarged computing capacity even further by wiring multiple transistors together in increasingly complex ways.
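The paragraph above notes that digital computers translate information into series of ones and zeros. As a minimal illustration (not tied to any historical machine), the Python sketch below shows how even ordinary text reduces to binary digits:

```python
# Illustrative sketch: how text becomes the ones and zeros a digital computer stores.
message = "IBM"

# Each character maps to a number (its ASCII code), and each number to eight bits.
for ch in message:
    code = ord(ch)              # e.g. 'I' -> 73
    bits = format(code, "08b")  # 73 -> '01001001'
    print(ch, code, bits)
```

Everything a digital computer handles, whether census figures, images, or music, is ultimately encoded this way, which is why such machines could manage bulk data far more flexibly than analog ones.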

The use of multiple transistors for computing purposes was an important step, but it had obvious drawbacks. Making machines capable of processing a great deal of information required connecting many transistors, which took up a great deal of space. Then, in the late 1950s, inventors in the United States developed an innovative solution. Using silicon, they could integrate transistors and capacitors in a way that clumsy wiring could not accomplish. The silicon-based integrated circuit freed computer technology from size constraints and opened the door to additional advancements in computing power.

Even so, digital computers remained large, expensive, and complicated to operate, and their use was largely confined to universities and the military. Only gradually over the 1970s did computing technology become more widely available, largely thanks to mass-produced general-purpose computers, sometimes called minicomputers, designed by IBM and the Digital Equipment Corporation. These served a variety of government and private purposes, such as tabulating the census, managing the flow of tax monies, and processing calculations related to creditworthiness (Figure 15.15). But despite being somewhat cheaper, minicomputers remained out of reach for average users.


Figure 15.15 Computers, 1950s. In the 1950s, computers required complex data management systems to operate. The U.S. Census Bureau used this device to transfer data from paper questionnaires to microfilm to allow for rapid processing by its computers.


The journey from minicomputers to personal computers began with the Intel Corporation, established in 1968 in Mountain View, California, in a region now commonly called Silicon Valley. During the 1970s, Intel developed a line of integrated circuits that were not only more powerful than their predecessors but also programmable. These became known as microprocessors, and they revolutionized computing by holding all of a computer's processing power in a single integrated circuit. In 1975, a company in New Mexico released the first marketed personal computer, the Altair 8800. This used an Intel microprocessor and was promoted to computer hobbyists eager to wield a level of computing power once available to only a few. The Altair's popularity inspired competing products like the Apple, the Commodore, and the Tandy Radio Shack computer (Figure 15.16). These personal computer systems were far easier to use and appealed to a much larger market than just hobbyists.


Figure 15.16 Personal Computer, 1980s. The Tandy Color Computer 3 shown here, released in 1986 and nicknamed the CoCo 3, was one of many personal computers released in the 1980s that average consumers were able to buy for their homes.

By 1982, there were 5.5 million personal computers in the United States, and over the next decade, their number and computing power rose exponentially. Computers proliferated in government offices, private firms, and family homes. Then, in 1984, Apple introduced the world to the Macintosh computer, which not only used a mouse but also replaced the standard text-based user interface with one based on graphics and icons. Recognizing the user-friendly possibilities of this graphical interface, competitors followed suit. Before long, the design popularized by Apple had become the norm.

By the end of the 1980s, not only had personal computers become common, but the microprocessor itself could be found everywhere. Microprocessors were incorporated into automobiles, cash registers, televisions, and household appliances and made possible a variety of other electronic devices like videocassette recorders and video game systems (Figure 15.17). Computer systems were created to store and manage financial, educational, and healthcare information. In one form or another, and whether they realized it or not, by the 1990s, almost everyone in the developed world was interacting with computers.


Figure 15.17 The Atari Video Computer System, 1977. Released in 1977, the Atari Video Computer System could be connected to almost any television, allowing users to play video games at home. The software was stored on small plastic cartridges that were plugged directly into the machine.

Modems were hardly new in the 1990s, but they became much faster and more common with the rise of the internet. The origins of the internet date back to the 1960s and the efforts of government researchers in the United States to use computers to share information. These developments were especially important for the U.S. Department of Defense during the Cold War and resulted in the emergence of the Advanced Research Projects Agency Network (ARPANET). In creating ARPANET, researchers developed many of the technologies that, over the next few decades, formed the basis of the internet we know today.


The Internet and Social Media

The process of globalization has been accelerated by the rise of the internet and the various social media platforms, such as Instagram, Facebook, and Twitter, that it hosts. Many people were introduced to the potential of computer networks for sharing information and creating small social networks in the 1980s, when individual users became able to connect their computers to others by using modems and telephone networks. This connectivity gave rise to regional bulletin board systems (BBSs), in which one person's computer served as a host for those of other users (Figure 15.18). BBSs functioned much like websites today. Though they ran far more slowly and had limited capabilities, they allowed users to share computer files like games and images, post messages for others to read, participate in virtual discussions and debates, and play text-based online games. Because BBSs used phone networks to communicate, and long-distance calls were then expensive, their users tended to be local.


Figure 15.18 A Bulletin Board System, 1980s. Bulletin board systems like this one relied on colorful text and simple graphics to make them appealing. They appear very limited compared to today's websites, but in the 1980s, they were revolutionary and opened new possibilities for the future of communication.


Throughout the 1980s, BBSs continued to be popular with computer hobbyists and those intrigued by the idea of unique virtual communities, while networking technology improved steadily behind the scenes. The United States, Europe, and other developed countries were busy adopting a uniform protocol system that would allow computers around the world to communicate easily with one another. Once this protocol had been established, the commercial internet, as we currently understand it, was born.
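The uniform protocol system described above became the TCP/IP suite, which still underpins the internet today. As a rough, self-contained illustration of what a shared protocol makes possible, the Python sketch below has two programs exchange bytes over a TCP connection on one machine; the hostnames and messages are invented for the example:

```python
import socket
import threading

# Illustrative sketch: two endpoints exchanging data over TCP, the shared
# protocol that lets otherwise unrelated computers interoperate.

def serve(listener):
    conn, _ = listener.accept()      # wait for one client to connect
    data = conn.recv(1024)           # read its message
    conn.sendall(b"echo: " + data)   # reply using the same protocol
    conn.close()

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))      # port 0: let the OS pick a free port
listener.listen(1)
port = listener.getsockname()[1]

threading.Thread(target=serve, args=(listener,)).start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello")
print(client.recv(1024).decode())    # -> echo: hello
client.close()
```

Because every machine that speaks TCP/IP follows the same rules for connecting and exchanging bytes, the same code works whether the two endpoints sit on one desk or on different continents.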

As early as 1987, about 30,000 hosts resided on the burgeoning internet. Soon, telecommunications and software companies began to exploit this new network by creating online service providers like America Online (AOL) to act as gateways to the internet. Initially, they used standard phone lines and modems to connect, much as BBSs had. But as the volume of information on the internet increased exponentially, service providers turned to more expensive broadband connections that used cable television lines and even dedicated lines to connect. During the 1990s, the first websites, the first internet search engines, and the first commercial internet platforms were established.

By 2005, more than one billion people worldwide were using the internet regularly. They were able to shop online, make phone calls around the world, and even create their own websites with almost no technical training. Never before had the world been so connected. In 2004, Facebook was launched. Originally a networking tool for Harvard students, it quickly expanded globally to become a giant in the new world of social media. By 2010, nearly half a billion Facebook users around the world were sharing images and messages, creating communities, and linking to news stories. By 2022, the number of Facebook users had reached nearly three billion.

Before 2007, almost all internet users gained access to the network via a personal computer, either at home or at work. That year, however, Apple Inc. released the first iPhone, not only a powerful cell phone but also a portable computer capable of performing all the tasks that once required a desktop computer. Even more revolutionary, it connected to the internet wirelessly through cell phone infrastructure. While the iPhone was not the first phone to connect to the internet, its revolutionary touch-screen interface was far superior to earlier systems. Within just a few years, other cell phone manufacturers were imitating its design and putting smartphones, and thus internet access, in the pockets of users around the world.

Smartphones have transformed life in developing countries, where they have helped bypass some of the traditional stages of infrastructure creation. In Africa, for example, people living where no landlines exist can now communicate with others using cell phones. Small farmers and traders can use cell phones for banking and to connect with potential suppliers and customers. In communities without libraries, schoolchildren can access the internet's resources to study.

Smartphones have also democratized the internet, serving as powerful tools for organizing and promoting political change. The large pro-democracy movement in Cairo's Tahrir Square captured the world's attention in 2011, for example. But it began with 25-year-old activist Asmaa Mahfouz's YouTube video of January 18, 2011, in which she spoke directly to the camera and urged young Egyptians to protest at the square as part of the larger Arab Spring, a call for government reform and democracy that echoed across the Arab world.

The Arab Spring was touched off in December 2010 when Muhammad Bouazizi, a young college graduate, set himself on fire in Tunisia after government officials there tried to interfere with the fruit cart that was his only source of income. Other young Tunisians took to the streets in protest, and demonstrations began again in January 2011. As people died in confrontations with government forces, President Zine al-Abidine Ben Ali fled the country, and Tunisia's prime minister resigned shortly thereafter.

The Tunisian protests led to similar demonstrations in Egypt. On January 17, 2011, an Egyptian set himself on fire near the nation's Parliament to protest the lack of economic opportunities. Crowds of mostly young people responded with massive demonstrations that lasted weeks (Figure 15.19). These demonstrations were fueled by and broadcast to the world through text messages, photos, tweets, videos, and Facebook posts sent by thousands of mobile phones, including that of Mahfouz. The devices amplified the calls for democracy and showed the world the Egyptian government's use of violence to try to silence the protestors. Egyptian President Hosni Mubarak resigned on February 11, 2011. He was later convicted for his role in ordering government forces to harm and kill protestors.


Figure 15.19 The Egyptian Revolution. (a) Internet-connected cell phones using social media applications like Facebook were a common sight at the large 2011 protests at Tahrir Square, Cairo. (b) The map shows the results of other uprisings in Africa and the Middle East that were part of the Arab Spring of 2010–2012.


In the wake of the Egyptian protests, activists in Libya, Yemen, Syria, Morocco, Lebanon, Jordan, and other countries coordinated their activities using computers and smartphones to access social media, video, and mobile phone messaging. These efforts resulted in protests, changes to the laws, and even the toppling of governments, such as in Egypt and Tunisia. They also led to civil war in Syria, Iraq, and Libya, leading to thousands of deaths and a refugee crisis in the Mediterranean. While Twitter and Facebook were useful for scaling up protests, the movements to which they gave birth often struggled to find a purpose in countries without a well-established resistance movement.

Since 2011, governments around the world have come to recognize the power of social media to bring about change, and many authoritarian and even ostensibly democratic leaders have moved to limit or block social media use in their countries. China has blocked Facebook and Twitter since 2009 and encourages its citizens to instead use the state-authorized app WeChat, which shares information with the government. In 2020, India banned the social media app TikTok, claiming it threatened state security and public order. In March 2022, following its February invasion of Ukraine, Russia banned Instagram and Facebook because the government alleged the platforms carried messages calling for violence against Russian troops and against Russian President Vladimir Putin. Turkmenistan has gone further than China, India, or Russia. It not only bans Facebook and Twitter, but it also requires citizens applying for internet access to swear they will not try to evade state censorship.

In the United States, lawmakers have recognized that social media platforms like Facebook and Twitter can both promote and endanger democracy. Social media provides extremist groups with the ability to attract followers from across the nation and incite violence. Groups can use the platforms to spread fake news, and a report by the U.S. Senate concluded that Russian intelligence operatives used Facebook, Twitter, and Instagram to manipulate voters. Legislators have called on social media companies to more actively censor the content on their platforms and to limit or block access by groups or persons spreading hate speech or disinformation. The potential for misuse of technology is heightened by advances that enable the creation of deepfakes, computer-generated images that closely resemble real people.


Medical Miracles and Ongoing Health Challenges

Advances in computer technology were not the only technological success stories of the post–World War II world. In 1947, scientists perfected an artificial kidney, and just five years later, the first successful kidney transplant was performed. In the 1950s, antipsychotic drugs were developed and used to treat neurological disorders that once consigned patients to a lifetime of difficult treatment in a psychiatric hospital. Also in the 1950s, geneticists discovered the double-helix structure of DNA, information that was crucial for later advancements such as the ability to use DNA to diagnose and treat genetic diseases. In 1962, a surgical team successfully reattached a severed limb for the first time, and in 1967, the first human heart transplant took place. Over the next decade and a half, medical advances made it possible to conduct telemedicine, view and monitor internal organs without performing surgery, and monitor the heartbeat of a fetus during pregnancy.

Medical science also made enormous gains in eradicating diseases that had been common for centuries. Polio epidemics, for example, had caused paralysis and even death since the late 19th century. In 1950, the first successful polio vaccine, developed by the Polish-born virologist Hilary Koprowski, was demonstrated to be effective in children. This was an orally ingested live vaccine, a weakened form of the virus designed to help the immune system develop antibodies. In the meantime, researcher Jonas Salk at the University of Pittsburgh was developing an injectable vaccine that used an inactivated virus, one that could not cause disease but still triggered the body to produce antibodies (Figure 15.20). In 1955, Salk's vaccine was licensed for use in the United States, and mass distribution began there. Other vaccines were developed in the United States and other countries over the next several years. Their use has nearly eradicated polio, which once caused hundreds of thousands of cases. When polio was detected in an adult in New York in July 2022, it was the first case in the United States since 2013.


Figure 15.20 Jonas Salk. After the polio vaccine he developed proved successful, Dr. Jonas Salk chose not to patent it to ensure it would be used freely around the world.


The eradication of smallpox is another important success story. Centuries ago, smallpox devastated communities around the world, especially Native American groups, which had no immunity to the disease when Europeans brought it to their shores. Early vaccines based on the cowpox virus were deployed in the United States and Europe in the 18th century with great effect. In the 20th century, advancements made the vaccine safer and easier to administer. However, by the 1950s, much of the world remained unvaccinated and susceptible. In 1959, the World Health Organization (WHO) began working to eradicate smallpox through mass vaccination, redoubling its efforts in 1967 through its Intensified Eradication Program. During the 1970s, smallpox was eradicated in South America, Asia, and Africa. In 1980, the WHO declared it had been eliminated globally.

The WHO's smallpox program is considered the most effective disease-eradication initiative in history, but it was an aggressive campaign not easily replicated. And without a vaccine, the problems of controlling transmissible diseases can be immense. A novel disease was first reported among Los Angeles's gay community in 1981, and by 1982 it had become known as AIDS (acquired immunodeficiency syndrome). Researchers realized it was commonly transmitted through sexual intercourse but could also be passed by shared needles and blood transfusions. At that time, the U.S. Centers for Disease Control explained that AIDS was not transmitted through casual contact. However, the information did little to calm rising concerns about this still largely mysterious and deadly disease.

By 1987, more than 60,000 people in the world had died of AIDS. In the United States, the government was slow to fund research to develop treatments or to find a cure. That year, activists at the Lesbian and Gay Community Services Center in New York City, concerned by the toll AIDS was taking on the gay community and by the government's seeming indifference to a disease the media depicted as affecting primarily gay men, an already stigmatized group, formed the AIDS Coalition to Unleash Power (ACT UP). ACT UP engaged in nonviolent protest to bring attention to its cause and worked to correct misinformation regarding the disease and those who were infected with it.

By the year 2000, scientists in the developed world had acquired a sophisticated understanding of AIDS and the human immunodeficiency virus (HIV), and treatments had emerged that made it a manageable rather than a lethal disease, at least in the developed world. But in parts of the developing world, like Sub-Saharan Africa, infection rates were still rising. One difficulty was that HIV infection and AIDS had become associated with homosexuality, which carried stigma and, in some places, even legal penalties that made those infected reluctant to seek help. Addressing transmission with the general public also meant broaching sometimes culturally sensitive topics like sexual intercourse. Those attempting to control the spread of the disease often found themselves trying to influence social and cultural practices, a complicated task fraught with pitfalls.

This does not mean there were no successes. The spread of condom use and circumcision, public information campaigns, and the declining cost of treatment have greatly reduced the extent of the epidemic in Africa. However, AIDS is still an enormous and devastating reality for Africans today. Sub-Saharan Africa is home to nearly 70 percent of the world's HIV-positive cases. Women and children are particularly affected; Africa accounts for 92 percent of all cases of infected pregnant women and 90 percent of all infected children.

The Ebola virus has also threatened the health of Africans. The first known outbreak of Ebola, a hemorrhagic fever, took place in Central Africa in 1976. Since then, there have been several other outbreaks. In 2013–2016, an outbreak in West Africa quickly spread across national borders and threatened to become a global epidemic. Approximately ten thousand people fell ill in Liberia alone, and nearly half of those infected died.

The most recent challenge to world health, the COVID-19 pandemic, demonstrates the effects of both globalization and technological developments. The coronavirus SARS-CoV-2 appeared in Wuhan, China, an industrial and commercial hub, in December 2019. Airplane and cruise ship passengers soon unwittingly spread it throughout the world; the first confirmed case in the United States appeared in January 2020. As every continent reported infections, offices, stores, and schools closed, and travel bans appeared. Despite these restrictions, middle-class and wealthy people in the developed world continued almost as normal. Many worked, studied, shopped, visited friends and family, and consulted doctors online from their homes.

Low-paid workers in service industries often lost their jobs, however, as restaurants and hotels closed, and children without access to computers or stable internet connections struggled to keep up with their classes. Even the more fortunate in the developed world confronted shortages of goods from toilet paper to medicines to infant formula when global supply chains stalled as farm laborers, factory workers, dock hands, and railroad employees fell ill, or workplaces closed to prevent the spread of infection. Developing countries lacked funds to support their citizens through prolonged periods of unemployment. Although vaccines were developed in several countries, they were available primarily to people in wealthier nations. As of March 2022, only 1 percent of all vaccine doses administered worldwide had been given to people in low-income countries.

Beyond the Book

Public Art and Modern Pandemics

Dangerous diseases like HIV/AIDS can energize more people than just the doctors working in laboratories and the global leaders publishing reports. During the early years of the HIV/AIDS crisis, grassroots organizers around the world strove to focus attention on the problem. Their actions were necessary because governments often did little to prevent the spread of the disease or to provide treatment for those infected.

The AIDS Coalition to Unleash Power (ACT UP) became known for staging loud protests in public and sometimes private places to raise awareness about the disease. In the United States, the publicity generated through groups like ACT UP forced the government to pay greater attention and to budget more money to the search for a cure. Some artists responded to this movement with murals in well-known locations like the Berlin Wall (Figure 15.21).


Figure 15.21 AIDS Mural. Artists often used the western face of the Berlin Wall to create provocative murals. This one, painted in the 1980s, featured the name of the HIV/AIDS awareness group ACT UP. (The Berlin Wall came down in 1989.)

While some murals about diseases, especially HIV/AIDS, were calls to action, others aimed to educate the public. A mural painted on a wall in Kenya for World Malaria Day 2014 showed viewers the proper use of bed nets to help lower the rate of infection (Figure 15.22).


Figure 15.22 Malaria Mural. This 2014 mural in Kenya illustrates how to avoid malaria infection with mosquito nets. The caption reads, "Sleep in a treated net, every night, every season."

During the COVID-19 pandemic, artists also went to the streets. Some of the murals they painted demanded action or celebrated health workers. Others called for awareness about the rising number of elderly people dying of the disease (Figure 15.23).


Figure 15.23 COVID-19 Mural. In this German mural about the COVID-19 pandemic, the artist shows how the highly communicable disease separated generations while also highlighting the vulnerability of the elderly. The caption reads, "Family is everything."

  • What makes art a powerful medium for conveying messages about awareness? What aspects of these murals seem especially powerful to you?
  • Do you recall seeing artwork from the COVID-19 pandemic or any other disease outbreak? What stood out in it?
  • What other art forms might an artist use to communicate political or social messages? How are these methods effective?

Source: OpenStax, https://openstax.org/books/world-history-volume-2/pages/15-3-science-and-technology-for-todays-world
This work is licensed under a Creative Commons Attribution 4.0 License.

Last modified: Tuesday, October 31, 2023, 10:46 AM