CS101 Study Guide

Unit 1: Introduction

1a. Describe the history of software and computer design

  • Name a major historic trend in computer electronic circuitry.
  • What is the central historic trend in how computers are physically built?
  • What is one of the important algorithm-execution trends as computers evolved during their long history?
  • What was the first commercially-available computer programming language?

The history of computers arguably began in ancient Greece, where Aristotle (384 BC–322 BC) laid the philosophical foundation for the machines we use today. That line of thought led first to mechanical and then to electronic machines. Within electronics, the trend ran from vacuum tubes to transistors to integrated circuits. Those circuits have become far denser over time, so that a computer that once needed an entire building to support it can now fit easily into a handheld device. "Computer", taken in a general sense, can refer to the counting sticks of prehistory and the beads of the abacus. Humanity has needed the ability to count, calculate, and organize data for most of its existence.

In the mid-1800s, the first computer program was published by Lovelace, who worked with Babbage. Babbage had designed a computer that could not be fully implemented at the time because of funding problems and a lack of suitable materials. In recent times, that same machine was built according to his plans using modern materials, and it worked exactly as originally designed.

As with Babbage's machine, programming languages have evolved as well. Some could not be implemented because suitable hardware did not yet exist. However, those early efforts began a process whose most important attribute is a gradual transition from machine-oriented to human-oriented languages. With that transition came the tools that enable the creation of systems of greater and greater complexity.

Review:

1b. Explain formal logic and how it is a subset of human logic

  • What is "formal logic"?
  • What type of thinking does formal logic entail?
  • What is the difference between formal logic and human logic?
  • How does Boolean algebra relate to formal logic?

Formal logic, a philosophy developed by Aristotle of ancient Greece (384 BC–322 BC), admits only conclusions that are completely incompatible with any alternative. For instance: On/Off, In/Out, Yes/No, Black/White. Relating this for a moment to set theory (Set Theory), nothing can be a member of more than one set at the same time.

Human logic, the way humans actually think, admits an infinite number of conclusions. One can come to a conclusion that contains many components. Something can be a member of an infinite number of sets, to one degree or another, at the same time. A good example is color (Color Models: RGB, HSV, HSL). Using RGB notation, a color is described by three sets, one each for red, green, and blue. Each set contains 256 possible members (the values 0 through 255), and we choose one member from each set to get a particular color. Other notations use a triplet of fractions that can be taken to any number of decimal places. Yet computer hardware requires other notations to be translated to RGB. Therein lies a serious limitation relative to the visual ability of humans.
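The discrete nature of RGB can be sketched in Java. This is an illustrative example only (the class and method names are mine, not from the course): each channel takes one of 256 values, and the three channels are commonly packed into a single integer.

```java
// Illustrative sketch: an RGB color as three discrete channels,
// each holding one of 256 values (0 through 255).
public class RgbDemo {
    // Pack three 8-bit channels into one int, as many image APIs do.
    static int pack(int r, int g, int b) {
        return (r << 16) | (g << 8) | b;
    }

    public static void main(String[] args) {
        int orange = pack(255, 165, 0);
        // Recover a channel by shifting and masking.
        int red = (orange >> 16) & 0xFF;
        System.out.println("red channel = " + red);          // 255
        // Only 256 * 256 * 256 distinct colors are representable,
        // a finite subset of what human vision can distinguish.
        System.out.println("colors = " + (256 * 256 * 256));
    }
}
```

Note that the packed value is a finite code, not the color itself; the 16,777,216 possibilities only approximate the continuum humans perceive.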

Thus, it is easy to see that formal logic, "binary thinking", is a small subset of human logic. Computers are simply boxes filled with on/off switches. Their operation can be described by formal logic, and their supporting mathematics by Boolean Algebra (The Boolean Algebra of Sets), the formal syntax and symbolic manipulation attending formal logic. This holds regardless of the sophistication of computers' electronics. Therefore, computers cannot, by any stretch of the imagination, be made to fully represent human thinking and abilities. The role of the computer scientist is to make these highly limited machines useful in a human-centric reality.
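Because Java's boolean type is exactly this two-valued formal logic, laws of Boolean algebra can be checked by brute force over all inputs. A minimal sketch (the class name is mine) verifying De Morgan's law:

```java
// Sketch: two-valued formal logic expressed with Java's boolean type.
public class BooleanDemo {
    public static void main(String[] args) {
        boolean[] values = {false, true};
        // De Morgan's law: !(a && b) is equivalent to (!a || !b).
        for (boolean a : values) {
            for (boolean b : values) {
                boolean lhs = !(a && b);
                boolean rhs = !a || !b;
                System.out.println(a + ", " + b + " -> equal: " + (lhs == rhs));
            }
        }
    }
}
```

Because there are only two truth values, four test cases exhaust every possibility, something human logic, with its infinite shadings, never permits.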

Review: 

  • An Introduction to Formal Logic
    • subsection on Logical Truth starting on pg. 7 
    • subsection 1.3 on two ways arguments can go wrong, starting on pg. 4. It is important to know when an argument is right or wrong, just as much as it is important when one is hearing, or not hearing, the truth.
  • A Concise Introduction to Logic: Read the introductory paragraph that appears immediately under the title.

1c. Describe the relationship between human logic, computers, and computer languages

  • What is the first step in writing a computer program?
  • How does the human-written program become something the computer can employ?
  • What are the limitations of the computer relative to mathematics?

How do humans write computer programs? The very first step is to have a vision of WHAT the computer needs to do. Then, that vision has to be articulated precisely in human language. This exposition can include text, figures, diagrams, tables, and other illustration tools. If you cannot articulate in human language what the computer is to do then it is not possible to write a corresponding computer program successfully. Management theory postulates a similar idea: no goal leads to no plan leads to no success.

Once there is a human-oriented expression of what the computer is to do, that expression can be translated into computer-language syntax. We have already seen that there are many computer languages, all of which have the same basic capabilities. The most modern languages are better at facilitating the construction of complex systems, but, in the end, it is the human expression of what has to happen that gets translated into the syntax of some computer language. (This does not deny that modern systems are often composed of numerous modules, each of which may be written in a different computer language.)

Beyond the HOW of computer languages, we have to take account of the computer's nature, its underlying essence. Computers, being finite binary machines, represent numbers in Base 2 rather than the Base 10 that humans use. We often see Base 16 (hexadecimal) or Base 8 (octal) notation, but, in the end, everything becomes Base 2 within the computer. Because of their finite nature, computers cannot represent all fractions, very large numbers, or very small numbers with total accuracy. Whereas computer limitations make their accuracy inherently limited, mathematics itself is of infinite accuracy. Even an elementary quantity such as 1/3 is beyond the computer's ability to represent exactly.
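These ideas are easy to demonstrate. A short sketch (the class name is mine) shows one integer rendered in three bases, and a classic pair of decimal fractions that have no exact binary representation:

```java
// Sketch: number bases, and the limits of binary representation.
public class NumberDemo {
    public static void main(String[] args) {
        int n = 77;
        System.out.println(Integer.toString(n, 2));   // 1001101 (Base 2)
        System.out.println(Integer.toString(n, 8));   // 115     (Base 8)
        System.out.println(Integer.toString(n, 16));  // 4d      (Base 16)

        // 0.1 and 0.2 have no finite Base-2 expansion, so the stored
        // values are approximations and the sum is not exactly 0.3.
        double sum = 0.1 + 0.2;
        System.out.println(sum);         // 0.30000000000000004
        System.out.println(sum == 0.3);  // false
    }
}
```

The three printed strings describe the same stored bit pattern; only the human-facing notation changes.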

The computer's binary nature and its inherent inaccuracy must be taken into account when translating any process for computer implementation. Iterative programs in particular, those that loop through a process as an answer is refined, must account for error build-up as operations proceed. Computers can be useful only when computer programs are written with such careful consideration.
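Error build-up is easy to observe. The sketch below (the class name is mine) adds 0.1 one million times; mathematically the result is exactly 100000, but the computed value drifts:

```java
// Sketch: rounding error accumulating in an iterative computation.
public class DriftDemo {
    public static void main(String[] args) {
        double sum = 0.0;
        for (int i = 0; i < 1_000_000; i++) {
            sum += 0.1;  // each addition contributes a tiny rounding error
        }
        System.out.println(sum == 100000.0);           // false
        System.out.println(Math.abs(sum - 100000.0));  // the accumulated drift
    }
}
```

Techniques such as summing in a different order or carrying a correction term can reduce this drift, which is why numerical programs deserve the careful consideration described above.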

As you review this module, keep in mind that it is not so much the particular languages that matter by name. Rather, you should understand the evolution of computer languages: the movement from machine-centric to human-centric design, the trend toward increasing modularization, and the associated practice of building programs from pre-existing, trusted modules.

Review The History of Programming Languages. Focus on seeing the big picture of language evolution rather than trying to memorize particular language names and when they were created.


1d. Describe the basic computer model

  • What are the three major activities of a computer system?
  • What are the two major components of a computer system?
  • What mechanism allows computers to communicate with each other?
  • What are the two types of computer programs?

A computer is a very handy tool, a black-box that humans can program to carry out a sequence of steps. It has three basic capabilities: 1) input data, 2) process data, 3) output processing results. The computer does not invent the steps by itself. Even so-called "adaptive" systems still work according to human specifications. 

Lest we take "sequence of steps" too literally, note that a single core within a CPU can truly do only one thing at a time. But modern CPUs contain multiple cores, each of which can operate independently, so a single computer can actually be doing more than one thing at a time. Even peripherals such as printers and secondary memory (permanent data storage) can be operating while a CPU's cores continue with other work. Thus, a computer's throughput, how much it can get done within a given block of time, is expanded well beyond "one thing at a time".

Parallel processing takes place within the same computer memory: a single program can use threads that allow its modules to function independently but in coordination. Unrelated programs can also be running on other cores and still coordinating; they share the same physical memory but not the same program space. The operating system manages all this and distributes the workload according to the number of cores and the memory available. Therein may lie a bit of trouble, because a computer can handle only so much workload before the overhead of its internal management outweighs the benefit of doing multiple things at once.
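A minimal sketch of threads sharing memory (the class and variable names are mine): two threads sum different halves of a range into a shared array, and join() coordinates the work before the results are combined.

```java
// Sketch: two threads cooperating through shared memory.
public class ThreadDemo {
    public static void main(String[] args) throws InterruptedException {
        long[] partial = new long[2];  // shared memory, one slot per thread

        Thread t1 = new Thread(() -> {
            for (int i = 1; i <= 500; i++) partial[0] += i;
        });
        Thread t2 = new Thread(() -> {
            for (int i = 501; i <= 1000; i++) partial[1] += i;
        });

        t1.start();
        t2.start();
        t1.join();  // wait for both workers to finish
        t2.join();  // before combining their results

        System.out.println("sum = " + (partial[0] + partial[1]));  // sum = 500500
    }
}
```

Giving each thread its own slot avoids a data race; had both threads added into one variable, the result would be unpredictable without further synchronization.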

An opportunity to further increase throughput (workload over time) beyond what one computer can handle is presented by distributed processing. Using networks, multiple heterogeneous computers can communicate and coordinate. Totally independent programs, not sharing the same memory or program space, can still exchange data and cooperate. It is also possible for different computers to be running the same program but with different data, while another computer consolidates results. Thus, here again, throughput can be improved. The modern world has shown that the exchange of data can yield many benefits, not only for industry but also for study, entertainment, and personal reflection.

As a general-purpose tool for human benefit, the computer can be quite powerful. Put it to good use.

Review:

  • Introduction to Computer Systems: Review the paragraphs associated with the table of contents. Together, these give a nice introduction. Do not let yourself become confused by references to storybooks you are not familiar with. Those references appear mainly in the first paragraph of each section and are not essential.
  • The Processor: Review the paragraphs associated with the table of contents. Each set of notes is very brief.
  • Introduction to Number Systems and Binary: This is an excellent video for understanding the basics of number systems. It is only 10 minutes long.

1e. Explain the programming life cycle

  • What is the very first step one should take when beginning a software development project?
  • At what point in a project is it best to catch something that has gone awry?
  • Describe an advantage of the waterfall model of software development projects.
  • Is software development a purely sequential activity?

We must always prioritize WHAT the computer is to be made to do, and what it is to be made NOT to do. It is seriously insufficient to know only HOW to make the computer do things. That approach leads practitioners into the trap of looking for nails to bang on with whatever hammer they happen to have. Claiming that this hammer is a universal tool for all needs is an even worse approach. If we are to provide value, we have to first discover what needs to be accomplished. Once that is established, we can discover how best to bring a solution to those needs.

Notice that the paragraph above puts computers and languages, hardware and software, midway on the list of activities attendant on projects intended to provide satisfying solutions. There are seven general cyclical steps in a software project. Things related to writing software should be left to Step 4 and later.

  1. Analyze
  2. Plan
  3. Design
  4. Implement
  5. Test and Debug
  6. Deploy
  7. Maintain

While a project may start off with these activities taking place one after another, the activities do not stand alone. Each provides information that informs prior and future steps in the process. Any project plan, however, has to clearly differentiate between the various activities so that milestones and progress can be measured. That is true with any process a project may follow. Otherwise, it is not possible to tell when a milestone has been reached or a project has been completed.

Do not put faith in process alone. Many teams get caught up with various process-related activities and forget about delivering a superior product. It is not necessarily the case that the project is successful just because all the process squares have been checked.

Review:

  • The Programming Lifecycle: This is a very short general-purpose reading. The listing of steps is not completely reflected in the diagram. And, the list of steps merges analysis and planning. But one can still get an overall sense of how a software project should proceed.
  • Comparing Waterfall, Unified, and Agile Software Development Processes: See the section at the end, Comparisons. It is interesting to note that even the Waterfall model is not a series of rigid steps with no feed-back or feed-forward. Whatever process is selected, it must be adapted to the project at hand. 

1f. Discuss the history of the Java programming language

  • What is the difference between a compiler and an interpreter?
  • How are compilers and interpreters involved in computer programming?
  • What is an Integrated Development Environment (IDE)? Why is it important?
  • Why is Java important to the evolution of computer languages?

Unless specified by the customer, the choice of computer language should be left until after the design phase of a software development project. Computer languages are simply means of expressing human thought in a way that a computer can do something with it. They are not the main focus of a computer software project. The main focus is customer long-term satisfaction.

For the purposes of this course, Java is the specified computer language. Compilers, combined with linkers, turn source code into machine code that a processor can execute directly. Interpreters, by contrast, read the source code and execute it line by line as it is read: the interpreter keeps reading the source code and determining what to do next, given current variable values. In this case, what the processor does is implied by the source code, whereas a compiler/linker creates machine code first for direct execution by the processor.

Java is an object-oriented language that combines a compiler with an interpreter. Source code is compiled into bytecode, and the bytecode is executed by a virtual machine that runs on the computer. Each processor type needs a Java Virtual Machine (JVM) written particularly for it. But the source code is highly portable since the JVM, if standards are followed, always runs the code in exactly the same way, regardless of the computer's word size or processor type. This is in contrast to languages such as C/C++, where, for instance, the number of bits in an integer varies with the compiler/linker and the word size of the computer at hand. Therein lies a caution: although Java's syntax is very similar to C/C++, it is not at all the same computer language.
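This portability is visible in the language itself. A small sketch (the class name is mine) prints sizes that the Java specification fixes on every platform, unlike C/C++ where the width of an int varies:

```java
// Sketch: Java primitive sizes are fixed by the language specification,
// regardless of the processor or word size of the host computer.
public class SizeDemo {
    public static void main(String[] args) {
        System.out.println("int bits:    " + Integer.SIZE);  // always 32
        System.out.println("long bits:   " + Long.SIZE);     // always 64
        System.out.println("double bits: " + Double.SIZE);   // always 64
        // Even overflow is defined identically everywhere: it wraps around.
        System.out.println(Integer.MAX_VALUE + 1);  // -2147483648
    }
}
```

The same bytecode produces the same output on any standards-conforming JVM, which is precisely the portability guarantee described above.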

The history of Java is interesting. It originated at Sun Microsystems, a company that fell on hard times and was acquired in 2010 by Oracle, a still-thriving company with a long and successful history. Java continues today. Its standards are guided by a committee of experts but are controlled by Oracle. Java remains a steady, well-respected language, and Java skills are in high demand.

Review:


1g. Set up a Java Development Kit (JDK) for Java development

  • What is the difference between JDK and JRE?
  • Where does one go to obtain the JDK?
  • Is JDK's installation process exactly the same for all operating systems?
  • What are two important (Integrated Development Environments) IDEs that support Java programming?

Here is where we start getting our hands dirty. We start by downloading and installing the Java Development Kit (JDK). Following the instructions in the readings, you download the JDK for your type of computer and the operating system it is running. While the JDK should operate the same on any computer, its underlying modules are not the same for all computers and operating systems.

The JDK contains several tools. The most important are the Java compiler (javac) and the Java Runtime Environment (JRE). The javac tool reads the programmer's source code (*.java files) and compiles it into *.class files that contain bytecode. The java launcher reads the *.class files and runs them on the Java Virtual Machine (JVM) to execute the program. You can write your first program using a simple text editor (Notepad, for instance).
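The classic first program ties these tools together. Save the code below as HelloWorld.java; the comments show the compile and run steps:

```java
// HelloWorld.java
//   javac HelloWorld.java   -> produces HelloWorld.class (bytecode)
//   java HelloWorld         -> the JVM executes the bytecode
public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello, World!");  // prints Hello, World!
    }
}
```

Note that the file name must match the public class name exactly, including capitalization.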

While it is possible to write source code using a text editor, you should learn to use an IDE. As your programs grow in complexity, lines of code, modules, and third-party libraries, basic tools demand ever more effort and time, and project timeliness and quality will suffer. Project schedules and budgets can rarely be expanded to accommodate this extra time, so quality will necessarily suffer. You may well hear some practitioners brag that "I do my programming with a text editor" in an attempt to put themselves above others. But nobody thinks you are smart if you deliver sloppy work, exceed the budget, or cause delay. The demands of timeliness and quality are far too high with modern systems for you not to use appropriate tools.

Review Downloading and Installing JDK: Do not bother with the first section on JDK Versions; at the very bottom of the page is a second section marked JDK Versions. Examine that one for installation instructions for your particular computer and for some example code. There is also a discussion of programming tools.


1h. Set up a NetBeans IDE for Java programming

  • Where does one go to download NetBeans?
  • Why is NetBeans important to Java programming?
  • With what other tools does NetBeans integrate?

We talked earlier about why an IDE is important to computer programming. This goes for all languages, not just Java. Besides programming support, NetBeans facilitates the use of MySQL, a popular open-source database management system (DBMS). MySQL, now owned by Oracle, works well alongside Oracle's DBMS offerings, so that proofs-of-concept can be readily transitioned to production.

Worldwide accessibility is very important in today's business and industrial environment. This includes data acquisition, storage, management, processing, and delivery. Thus we have "cloud computing". We can make our local computers accessible or we can run our applications on someone else's computer, operating it remotely. Java is particularly good at this type of application. NetBeans greatly facilitates writing such programs.

Review Downloading and Installing NetBeans IDE: There is a brief introduction and then a table of contents. Pay particular attention to Chapters 1 and 2. For now, the rest can be used for reference.


1i. Write and run a simple Java program

  • What is the basic structure of a Java program?
  • What is the difference between Java source code and bytecode?
  • How is it that a simple text editor like Notepad can be used to write Java source code?
  • List the general steps in getting a Java program to compile and execute.

So far, we have studied the "theory" of computers, programming, and the Java programming language. In this section, we put theory into practice. While it is true that theory is important, it has no value unless it has a positive effect on what we actually do. On the other hand, practice without a solid theoretical foundation rarely leads to more than taking shots in the dark. It is important to understand what one is doing and why before actually doing anything.

For very simple programs, using a text editor such as Notepad is sufficient. We can write the program, save it, compile it, and run it, using distinct steps. Problems arise when trying to debug programs that grow longer and more complex. Eventually, a text editor such as gedit becomes useful because it can highlight the various components of Java source code (and other languages too). Ultimately, an integrated development environment (IDE) becomes essential. Such a tool does everything a sophisticated text editor does, but it also highlights code that does not pass the compiler's review and points out places where better programming practice should be applied. Then, during execution, one can step through the program line by line to discover logic errors and mis-valued variables. Programmer mistakes, as well as faulty third-party modules, become much easier to spot.

Be sure to take this opportunity to practice, practice, practice by going beyond "Hello World".

Review:

  • Introduction to Java: Sections 3, 4, 5, 6 are for review of previous material. Concentrate on the others so that you go through the entire process of creating, compiling, and running a Java program.
  • Fill in the Blanks - Review: This quiz would be well worth going through to increase your attention to important detail.

1j. Explain how computers are a tool to assist people and describe appropriate and inappropriate uses of computers

  • If we learn we can do something, should we necessarily do it?
  • In what way does innovation cause change?
  • In what way does change cause innovation?
  • Does innovation require advances in technology?

Innovation has been described as the process of making improvements by introducing something new. While short, the construction of the previous sentence is very important. Notice the prominence of the word "improvement". The phrase "something new" plays a supporting role; it is not the main point of the definition. In the same way that "change" is not necessarily improvement, neither is "something new".

"Improvement" itself can not be defined alone. It must be defined relative to an existing situation. Before implementation, the implications and risks of the "improvement" must be known and well understood. Thus, innovation is more than learning how to do new things. It is more than changing what already exists. There must be a goal in mind and the change or innovation must advance toward a positive goal. That advancement must be measurable so that present status can be determined relative to the goal.

Thus, innovators are far more than "change agents". They are those who enable the present situation to advance toward a better future. 

Review:


Unit 1 Vocabulary

This vocabulary list includes the terms listed above that you will need to know to successfully complete the final exam.

  • abacus
  • Aristotle
  • Babbage
  • Base 2 numeric notation
  • Base 10 numeric notation
  • binary thinking
  • Boolean Algebra
  • bytecode
  • C/C++
  • compilers
  • formal logic
  • hardware
  • human logic
  • integrated development environment (IDE)
  • innovation
  • interpreters
  • Java
  • Java Development Kit (JDK)
  • Java Runtime Environment (JRE)
  • linkers
  • Lovelace
  • machine code
  • memory
  • NetBeans
  • networks
  • operating system
  • processor
  • program
  • programming languages
  • set theory
  • software
  • source code
  • word size