
Examine the fundamentals of computers and how they are used to provide useful services.
The goal of this course is to teach you to think like a computer scientist. This way of thinking combines some of the best features of mathematics, engineering, natural science, and philosophy. Like mathematicians, computer scientists use formal languages to denote ideas – specifically, computations. Like engineers, they design things, assemble components into systems, and evaluate trade-offs among alternatives. Like scientists, they observe the behavior of complex systems, form hypotheses, and test predictions. Like philosophers, they create logical constructs that can be carried out by a machine. This is not to deny the value of the arts, which help non-practitioners understand and employ the resulting systems.
An important skill for a computer scientist is problem-solving: the ability to detect problems, think creatively about solutions, and express solutions clearly and accurately. As it turns out, the process of learning to program computers is an excellent opportunity to develop and apply problem-solving skills. On one level, you will be learning to write Java programs, a useful skill by itself. On another level, you will use programming as a means to an end, that end being the creation of something useful to society.
- Unit 1: Computer Programming
- Unit 2: Variables and Operators
- Unit 3: Input and Output
- Unit 4: Methods and Testing
- Unit 5: Conditionals and Logic
- Unit 6: Loops and Strings
- Unit 7: Arrays and References
- Unit 8: Recursive Methods
- Explain what computers are and what they do;
- Compare software and hardware;
- Express human logic in computer programming syntax;
- Describe various variable types and their differences;
- Choose means of performing operations on variables;
- Recognize and repair errors in syntax and logic;
- Explain the ways to get data into and out of a computer;
- Demonstrate Java's basic services;
- Dissect computer programs to reveal subtle errors;
- Convert blocks of oft-used code into reusable methods;
- State the difference between "logical" and "relational" operators;
- Cause programs to vary their activities as the input data changes;
- Create programs that perform repetitive actions;
- Explain how data should be grouped for a given analysis;
- Explain various ways to acquire the values of specific elements within a data grouping; and
- Compare recursive processes to traditional looping.

Explore this detailed survey of computing and programming, with an emphasis on understanding object-orientation and the Java and C++ computer programming languages. We will use history, theory, and practice to deliver lessons that prepare you for a career in computer science.
This course will introduce you to a number of more advanced Computer Science topics, laying a strong foundation for future study and achievement in the discipline. We will begin with a comparison between Java, the programming language used in the previous course, and C++, another popular, industry-standard programming language. We will then discuss the fundamental building blocks of Object-Oriented Programming, reviewing what we have already learned, while familiarizing ourselves with more advanced programming concepts. The remaining course units will be devoted to various topics, including the Standard Template Library, Containers, Exceptions, Recursion, Searching and Sorting, and generic programming. By the end of the class, you will have a solid understanding of Java and C++ programming, as well as a familiarity with the major issues that programmers routinely address in a professional setting.
- Unit 1: The Building Blocks of Object-Oriented Programming
- Unit 2: C++ and Java Differences
- Unit 3: C++ Standard Template Library
- Unit 4: Java Container Library
- Unit 5: Exceptions
- Unit 6: Recursion
- Unit 7: Searching and Sorting
- Explain the common computational elements for creating algorithms;
- Compare and contrast the features of Java and C++;
- Explain the importance of Java Containers and how their basic components are used;
- Explain the importance of the C++ Standard Template Library and how its basic components are used;
- Understand important common algorithms, such as sorting and searching;
- Evaluate programs using run-time analysis;
- Explain the drawbacks and benefits of recursion; and
- Solve simple problems by applying computational elements, algorithms, containers, and templates in a programming process, including problem statement, algorithm design, program construction, and solution analysis.

Learn fundamental programming concepts using the Python 3 programming language, a high-level interpreted language that is easy to read and write, with powerful libraries that provide additional functionality.
This course is an introduction to fundamental programming concepts by way of the Python 3 programming language. Python 3 is a high-level interpreted language that has many benefits, including easy-to-read and easy-to-write syntax and powerful libraries that provide additional functionality. Even though Python 3 is a great programming language for beginners, it is also used extensively for practical applications in engineering and data science. This course is intended for people with no or very little prior programming experience. It covers a range of topics, such as data types, control flow, functions, file operations, and object-oriented programming. When you finish this course, you will be able to create Python programs for a variety of applications.
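To give a flavor of the syntax this course builds on, here is a minimal sketch (standard library only; all names are illustrative) touching a few of the topics above – variables, a function, control flow, and a list:

```python
# A small taste of Python 3: a function, a conditional, and a loop.

def classify(n):
    """Return a label for an integer based on its sign."""
    if n > 0:
        return "positive"
    elif n < 0:
        return "negative"
    return "zero"

# A list (one of Python's built-in data structures) driving a for loop.
for value in [-2, 0, 7]:
    print(value, "is", classify(value))
```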
- Unit 1: Introduction to Python 3
- Unit 2: Operators
- Unit 3: Input and Flow Control Statements
- Unit 4: Data
- Unit 5: Functions
- Unit 6: Basic Data Structures II – Tuples, Sets, and Dictionaries
- Unit 7: File Handling
- Unit 8: Regular Expressions
- Unit 9: Exception Handling
- Unit 10: Object-Oriented Programming (OOP)
- Write Python programs within an Integrated Development Environment (IDE);
- Explain and apply the basic variable types int, float, str, and bool;
- Apply operators and operator precedence;
- Write programs that apply control structures, lists, strings, sets, tuples, and dictionaries;
- Write programs that import libraries for a variety of applications, such as random numbers and data visualization;
- Write interactive programs that ask for and use input from the user;
- Write programs that use functions and methods;
- Apply data visualization tools for basic data analysis;
- Translate search terms into regular expressions;
- Handle errors and exceptions in Python programs by using try, except, and finally statements; and
- Use object-oriented programming and object methods to write code that is easy to read and maintain.

Learn the C++ computer programming language, with a focus on syntax for primitive types, control structures, vectors, strings, structs, classes, functions, file I/O, exceptions, and other programming constructs.
In this course, we will learn the mechanics of editing and compiling programs in C++. We will begin with a discussion of the essential elements of C++ programming: variables, loops, expressions, functions, and the string class. Then, we will cover the basics of object-oriented programming: classes, inheritance, templates, exceptions, and file manipulation. We will then review function and class templates and the classes that perform output and input of characters to/from files. This course will also cover namespaces, exception handling, and preprocessor directives. In the last part of the course, we will learn some slightly more sophisticated programming techniques that deal with data structures such as linked lists and binary trees.
- Unit 1: Introduction and Setup
- Unit 2: Structuring Program Code
- Unit 3: Working with Simple Data Structures
- Unit 4: Object-Oriented Programming
- Unit 5: Advanced Concepts
- Compile and execute code written in C++;
- Code using elementary data types and conditional and iteration structures;
- Define and use functions, arrays, struct, unions, and enumerations;
- Write C++ applications using principles of object-oriented programming;
- Write templates and manipulate files;
- Translate simple word problems into C++;
- Perform debugging and fixing of common C++ errors; and
- Manage memory appropriately, including proper allocation/deallocation procedures.

Learn the components of Bitcoin and how they work together to keep Bitcoin's open, decentralized system running. This course will build the foundation you need to use and work with Bitcoin and other cryptocurrencies.
How does Bitcoin work? Why is Bitcoin called a "cryptocurrency"? What cryptography does it use? How is security maintained in a system with no central authority? This course will answer these and many more questions. In this course, you will learn the building blocks that make up the open, decentralized system that is Bitcoin.
We'll start by diving into the cryptographic algorithms used in Bitcoin, and walk through how these tools are used to keep the system secure and running. This course is designed for students with a technical background, including some coding experience. Many of the examples and exercises will require some familiarity with coding to follow along.
When you finish this course, you'll be able to differentiate between public and private keys and explain how they are used in Bitcoin transactions; calculate the hash of a piece of data and explain why hashing is used in Bitcoin's Proof-of-Work consensus protocol; list the functions of a wallet; describe the utility of nodes on the network; and more. You'll have the foundations necessary for understanding, working with, and building on Bitcoin and other open cryptocurrency systems.
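As a taste of the exercises involved, here is a minimal hashing sketch in Python (the course itself is not tied to one language; the double application of SHA-256 shown here is the pattern Bitcoin uses for block and transaction hashes):

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin hashes data by applying SHA-256 twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

print(double_sha256(b"example transaction data").hex())
# Proof-of-Work in a nutshell: miners vary a nonce until the block
# header's double-SHA-256 hash falls below a target value.
```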
- Unit 1: Introduction to Bitcoin Technology
- Unit 2: Cryptographic Algorithms
- Unit 3: Cryptographic Signatures
- Unit 4: Hashing
- Unit 5: Bitcoin Data
- Unit 6: Bitcoin Nodes and Wallets
- Unit 7: Transactions and Scripting
- Unit 8: Reaching Consensus
- Differentiate between symmetric and asymmetric encryption and between public and private keys;
- Convert data between encoding methods used in Bitcoin;
- Determine the validity of a Bitcoin transaction by evaluating the transaction data and signature;
- Differentiate between cryptographic algorithms used in Bitcoin and list their contributions to maintaining the system;
- Identify components of the Bitcoin system, such as nodes and wallets, and list their functions; and
- Summarize how the components of Bitcoin work together to keep Bitcoin's open, decentralized system running.

Survey basic abstract data types, their associated algorithms, and how they are implemented. Topics discussed include the structures of stacks, queues, lists, sorting and selection, searching, graphs, and hashing; performance tradeoffs of different implementations; and asymptotic analysis of running time and memory usage.
When we use programming for problem-solving purposes, data must be stored in certain forms, or Data Structures, so that operations on that data will yield a specific type of output. Imagine, for example, that a non-profit is having trouble staying afloat and needs an increase in donations. It decides it wants to keep track of its donors in a program in order to figure out who is contributing and why. You would first need to identify the properties that define those donors: name, address, amount donated, date of donation, and so on. Then, when the non-profit wants to determine how best to reach out to its donors, it can create a model of the average donor that contributes to the non-profit – say, based on the size of the gift and the location – so that it can better determine who is most receptive to its mission. In this case, the size of the gift and the location are the "data" of the donor model. If the non-profit were to use this model, it would identify real donors by first generating an abstract donor. This is an example of using Abstract Data Types, which both take into account the Data Structure (the way in which data about donors is stored) and provide the necessary operations on that structure.
In this course, we will discuss the theoretical and practical aspects of algorithms and Data Structures. We will also learn to implement Data Structures and algorithms in C/C++, analyze those algorithms, and consider both their worst-case complexity and practical efficiency.
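As a rough illustration of the donor example (sketched in Python for readability, though this course implements Data Structures in C/C++; the field names are invented), the structure stores each donor's properties while the ADT supplies operations on that structure:

```python
from dataclasses import dataclass

@dataclass
class Donor:
    # The Data Structure: how each donor's properties are stored.
    name: str
    location: str
    amount_donated: float

def average_gift(donors):
    # One of the operations the ADT provides on top of the structure.
    return sum(d.amount_donated for d in donors) / len(donors)

donors = [Donor("A. Smith", "Boston", 50.0),
          Donor("B. Jones", "Denver", 200.0)]
print(average_gift(donors))  # 125.0
```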
- Unit 1: Abstract Data Types and Arrays in C++
- Unit 2: Introduction to Stacks and Queues
- Unit 3: Pointers and References in C++
- Unit 4: Dynamic Memory Allocation
- Unit 5: Linked Stacks, Queues, and Lists
- Unit 6: Algorithm Efficiency
- Unit 7: Searching and Sorting Algorithms
- Unit 8: Hash Tables, Graphs, and Trees
- Discuss the general-purpose nature of Abstract Data Types (ADTs):
  - Describe ADTs; and
  - Summarize specific data types (SDTs) within the context of ADTs.
- Explain elementary data types within the wider context of ADTs:
  - Interpret the three elementary data types that are native to C/C++ (scalars, vectors, multi-dimensional arrays) within the context of ADTs;
  - Demonstrate the use of scalars, vectors, and multi-dimensional arrays;
  - Show how scalars are used to build vectors; and
  - Show how vectors can be used to build multi-dimensional arrays.
- Identify the basic Composite Data Types (CDTs):
  - Define the six basic CDTs (lists, stacks, queues, trees, graphs, hash tables);
  - Relate an application's requirements to an appropriate CDT; and
  - Implement solutions to applications requiring one or more of the basic CDTs.
- Examine algorithms to determine their computer-resource utilization:
  - Define Big-O analysis;
  - Explain why Big-O analysis is important to algorithm design and selection;
  - Distinguish between Big-O analysis, counting program steps, and counting lines of executable code; and
  - Organize search and sort algorithms according to their Big-O resource-utilization growth curve.
- Contrast sequential and binary search techniques (see the sketch after this list):
  - Define sequential search and name its primary attributes;
  - Define binary search and name its primary attributes; and
  - Compare the resource utilization curves of sequential and binary search as data size increases.
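For a concrete sense of that last comparison (sketched in Python for brevity, though the course works in C/C++): sequential search inspects elements one by one and grows linearly with data size, while binary search halves a sorted range at each step and grows logarithmically.

```python
def sequential_search(items, target):
    """O(n): inspect elements one by one."""
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

def binary_search(sorted_items, target):
    """O(log n): halve a sorted range at each step."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = [2, 3, 5, 7, 11, 13]
print(sequential_search(data, 7), binary_search(data, 7))  # 3 3
```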

Learn discrete mathematics in a way that combines theory with practicality. Major topics include single-membership sets, mathematical logic, induction, proofs, counting theory, probability, recursion, graphs, trees, and finite-state machines.
This course provides a clear, accessible introduction to discrete mathematics that combines theory with practicality. Discrete mathematics describes processes that consist of a sequence of individual steps, as compared to forms of mathematics that describe processes that change in a continuous manner. The major topics we cover in this course are single-membership sets, mathematical logic, induction, and proofs. We will also discuss counting theory, probability, recursion, graphs, trees, and finite-state machines.
Understanding the terms "single-membership" and "discrete" is important as you begin this course. "Single-membership" refers to something that is grouped within only one set, and to systems that can be in only one state at a time, at the same hierarchical level. Similarly, "discrete" refers to that which is individually separate and distinct. Anything can be in only one set or one state at a time. This follows Aristotelian philosophy, which holds that there are only two values of membership, 0 or 1. An answer is either no or yes, false or true, 0% membership or 100% membership, entirely in a set or state, or entirely not. There are no shades of gray. This is much different from Fuzzy Logic (due to Lotfi Zadeh), where something can be a member of any set or in any state to some degree or another. Degrees of membership are measured in percentages, and those percentages add to 100%. But even in Fuzzy Logic (multiple-membership, multiple-state, non-discrete logic), one ultimately comes to a crisp decision so that some specific action is taken, or not. For this course, it is enough to understand the difference between single-state and multi-state logic.
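A tiny sketch of the distinction (the membership functions here are invented for illustration): a crisp membership function returns only 0 or 1, while a fuzzy one returns a degree in between.

```python
def crisp_is_tall(height_cm):
    # Single-membership (Aristotelian): entirely in the set or not.
    return 1 if height_cm >= 180 else 0

def fuzzy_is_tall(height_cm):
    # Fuzzy: membership to some degree between 0% and 100%.
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30  # linear ramp between the bounds

print(crisp_is_tall(175), fuzzy_is_tall(175))  # 0 0.5
```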
As you progress through the units of this course, you will develop the mathematical foundation necessary for more specialized subjects in computer science, including data structures, algorithms, cryptology, and compiler design. Upon completion of this course, you will have the mathematical know-how required for an in-depth study of the science and technology that is foundational to the computer age.
- Unit 1: Sets, Set Relations, and Set Functions
- Unit 2: Counting Theory
- Unit 3: Mathematical Logic
- Unit 4: Mathematical Induction and Proofs
- Unit 5: Probability
- Unit 6: Recursion
- Unit 7: Graphs
- Unit 8: Trees
- Unit 9: Finite-State Automata
- Formulate solutions for selected problem classes involving single-membership sets, classifiable situations, or identifiable event categories;
- Apply objective mathematical reasoning to systems composed of discrete objects and events;
- Assess the accuracy of probabilistic statements applied to specific situations;
- Assess mathematical proofs claiming to show whether or not a condition Y holds, given a premise X;
- Interpret situations that have a predetermined sequence of actions that depend on a limited sequence of events;
- Describe sequences of events and mathematical results where the next event or result is a function of previous ones;
- Explain systems of states (collections of variables describing a specific reality) that have an initial state (given values for each variable) and conditions for transitioning (changing values) from one state to another;
- Categorize all possible outcomes for a series of events, or all possible collections of a set of objects;
- Diagram hierarchical relationships between individual entities within a given situation; and
- Apply networks of mathematical or system entities as tools in computer science to solve various real-world problems.

Get a broad, foundational introduction to the rapidly evolving field of artificial intelligence by learning how to build intelligent software solutions in today's business applications.
After using a really smart app that produced amazing results within seconds, you must have asked yourself: "How did it do that?" After you take this course, you will be able to start answering that question yourself! This course provides you with the fundamentals of the rapidly evolving field of artificial intelligence. Topics we will cover include:
- Intelligent Agents
- Various kinds of machine learning models
- Search algorithms (including heuristic and uninformed search)
- Iterative improvement algorithms
- Game playing, logic, and automated reasoning
- Knowledge bases
- Natural language processing, including generative AI
- Reasoning under uncertainty
You will need to know how to program in a modern language like Python, C#, or Java, and how to use readily available libraries to apply the concepts you learn.
- Unit 1: What Is Artificial Intelligence?
- Unit 2: Agent-Based Approach to AI
- Unit 3: Machine Learning and Its Importance
- Unit 4: Machine Learning Algorithms
- Unit 5: Problem-Solving Methods in AI
- Unit 6: Search Algorithms
- Unit 7: Iterative Improvement Algorithms
- Unit 8: Game-Playing Models
- Unit 9: Natural Language Processing
- Unit 10: Reasoning Agents
- Analyze the definition of "intelligence", from the Turing test to the four basic orientations of modern AI;
- Analyze the concept of "agents" in contemporary AI business solutions;
- Analyze the different types of agents and their capabilities;
- Describe the different kinds of machine learning algorithms and their significance in building AI business solutions;
- Apply supervised machine learning algorithms and contemporary libraries to build AI business solutions;
- Explain the principles of unsupervised machine learning and reinforcement learning models;
- Apply general AI-based problem-solving methods and their computational characteristics;
- Apply heuristic-based search algorithms to improve their optimality;
- Apply natural language processing concepts and techniques with existing libraries for analysis and generative applications;
- Discuss the principles of logical reasoning and reasoning under uncertainty; and
- Explain the basics of two-person, adversarial game-playing.

Learn and master machine learning (ML) concepts, algorithms, and real-world applications while gaining hands-on experience building and evaluating ML models with Python.
This comprehensive course is designed to equip you with a strong foundation in machine learning (ML) through a systematic, step-by-step approach. This course covers the essential principles of supervised and unsupervised learning algorithms, providing a deep understanding of how machine learning models work and how they can be applied in real-world scenarios. You will explore the entire ML workflow, from data collection and preprocessing to model building and evaluation, ensuring you gain practical, hands-on experience at each stage.
Throughout the course, you will master key concepts in data preprocessing, feature engineering, and model evaluation techniques. We will cover a range of core algorithms, including regression, classification, and clustering, as well as evaluation metrics to help you assess model performance and make data-driven decisions. Practical exercises and Python-based implementations will reinforce your understanding and allow you to build predictive models. By the end of the course, you will be equipped to handle complete machine learning projects, from data preparation to evaluation, while ensuring your models are both effective and ethical.
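As a preview of the Python-based implementations the course works toward, here is a minimal supervised-learning sketch using scikit-learn (the data is synthetic and made up for the example):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Toy data: y is roughly 3x + 1 plus a little noise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 3 * X.ravel() + 1 + rng.normal(0, 0.5, size=100)

# Hold out a test set, fit a regression model, and evaluate it.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("test MSE:", mean_squared_error(y_test, model.predict(X_test)))
```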
In addition to the technical skills, this course emphasizes the importance of ethical decision-making in AI development. You will explore critical issues like bias, fairness, and accountability in machine learning, learning how to build models that are not only accurate but also responsible and equitable. Whether you want to enhance your career, pursue further studies, or contribute to the growing field of AI, CS207 provides you with the knowledge and skills necessary to create impactful and ethical machine learning systems.
- Unit 1: Introduction to Machine Learning
- Unit 2: Machine Learning Workflow
- Unit 3: Data Preprocessing
- Unit 4: Data Visualization
- Unit 5: Supervised Learning – Regression
- Unit 6: Supervised Learning – Logistic Regression
- Unit 7: Unsupervised Learning – Clustering
- Unit 8: Model Evaluation and Validation
- Unit 9: Practical Implementation of ML Models
- Unit 10: Ethical and Responsible AI
- Explain machine learning concepts, including supervised and unsupervised learning;
- Explain the ML workflow, including data collection, preprocessing, modeling, and evaluation;
- Apply data processing and visualization techniques to prepare data sets, interpret data, and perform feature extraction;
- Implement machine learning models, including regression, classification, and clustering;
- Identify overfitting, underfitting, and other challenges in machine learning models;
- Build end-to-end machine learning projects that include documented workflows and are reproducible;
- Explain the performance of machine learning models using basic metrics; and
- Analyze ethical considerations in machine learning.

Learn data science using the Python programming language by looking at data processing, data analysis, visualization, data mining, and statistical models. By the end of this course, you will be able to implement Python code for these data science topics.
This course attempts to strike a balance between presenting the vast set of methods within the field of data science and the Python programming techniques for implementing them. Problem-solving and programming implementation are emphasized throughout the course, and all techniques are introduced using real-world programming examples. A major goal of the course is to ensure that when you finish, you will have the programming and conceptual expertise needed to join the field of data science.
Several Python modules, such as pandas, scikit-learn, scipy.stats, and statsmodels, will be introduced that are useful for data analysis, data visualization, and data mining. The course will gradually shift from introductory topics such as a review of Python, matrix operations, and statistics to applications and implementing programs involving data mining, visualization, statistical models, and time series analysis.
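For a first look at the style of code involved, here is a minimal pandas sketch (the column names are invented) that builds a dataframe and computes a grouped summary statistic:

```python
import pandas as pd

# A small dataframe built in-line; real projects often use pd.read_csv.
df = pd.DataFrame({
    "city": ["Boston", "Denver", "Boston", "Denver"],
    "sales": [120, 95, 130, 87],
})

# Group-by aggregation: average sales per city.
print(df.groupby("city")["sales"].mean())
```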
- Unit 1: What Is Data Science?
- Unit 2: Python for Data Science
- Unit 3: The numpy Module
- Unit 4: Applied Statistics in Python
- Unit 5: The pandas Module
- Unit 6: Visualization
- Unit 7: Data Mining I – Supervised Learning
- Unit 8: Data Mining II – Clustering Techniques
- Unit 9: Data Mining III – Statistical Modeling
- Unit 10: Time Series Analysis
- Use Google Colaboratory notebooks to implement and test Python programs;
- Explain how Python programming is relevant to data science;
- Construct and operate on arrays using the numpy module;
- Apply Python modules for basic statistical computation;
- Construct and operate on dataframes using the pandas module;
- Apply the pandas module to interact with spreadsheet software;
- Implement Python scripts for visualization using arrays and dataframes;
- Apply the scikit-learn module to perform data mining;
- Explain techniques for supervised and unsupervised learning;
- Apply supervised learning techniques;
- Apply unsupervised learning techniques;
- Apply the scikit-learn module to build statistical models;
- Implement Python scripts to perform regression analyses;
- Apply the statsmodels module to build and analyze models for time series analysis; and
- Explain similarities and differences between AR, MA, and ARIMA models.

Learn basic concepts in applied cryptography and see how they are implemented in real-world programs.
This course provides an introduction to the field of applied cryptography, aiming to balance theory, application, and implementation for those new to the field. Topics range from classical techniques involving symmetric and public key cryptography to more contemporary topics such as blockchain, zero-knowledge proofs, and quantum cryptography. A rudimentary background in Python and a measure of comfort with basic math concepts are assumed for the coding implementations.
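To give a sense of the coding level assumed, here is a minimal symmetric-encryption sketch using the Fernet recipe from the third-party cryptography package (one of several libraries the course might use; install it with pip install cryptography):

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # a fresh symmetric key
cipher = Fernet(key)

token = cipher.encrypt(b"attack at dawn")  # authenticated ciphertext
print(cipher.decrypt(token))               # b'attack at dawn'
```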
- Unit 1: Introduction to Cryptography
- Unit 2: Hash Functions
- Unit 3: Symmetric Encryption
- Unit 4: Asymmetric Encryption
- Unit 5: Signatures and Certificates
- Unit 6: Key Management
- Unit 7: Elliptic Curve Cryptography
- Unit 8: Zero Knowledge Proofs
- Unit 9: Quantum Key Distribution
- Unit 10: Quantum Algorithms
- Explain the difference between symmetric and asymmetric cryptography;
- Solve problems involving cryptographic applications;
- Compose programs that implement and apply hash functions;
- Create programs that implement and apply cryptographic techniques;
- Analyze architectures for signatures, certificates, and authentication;
- Explain the security benefits of quantum key distribution;
- Explain quantum algorithms useful for cryptographic applications; and
- Construct applications that show an understanding of zero-knowledge proofs.

Explore hardware/software components, assembly language, and the functional architecture and design of computers, with a focus on topics like instruction sets, processor arithmetic and control, Von Neumann architecture, pipelining, memory management, storage, and input/output.
Modern computer technology requires an understanding of both hardware and software, since the interaction between the two offers a framework for mastering the fundamentals of computing. The purpose of this course is to cultivate an understanding of modern computing technology through an in-depth study of the interface between hardware and software. In this course, you will study the history of modern computing technology before learning about modern computer architecture and a number of its essential features, including instruction sets, processor arithmetic and control, the Von Neumann architecture, pipelining, memory management, storage, and other input/output topics. The course concludes by examining the recent switch from sequential processing to parallel processing, surveying parallel computing models and their programming implications.
- Unit 1: Introduction to Computer Theory
- Unit 2: Instructions: Hardware Language
- Unit 3: Fundamentals of Digital Logic Design
- Unit 4: Computer Arithmetic
- Unit 5: Designing a Processor
- Unit 6: The Memory Hierarchy
- Unit 7: Storage and I/O
- Unit 8: Parallel Processing
- Unit 9: Look Back and Look Ahead
- Identify important advances that have taken place in the history of modern computing, and discuss some of the latest trends in the computing industry;
- Explain how programs written in high-level programming languages, such as C or Java, can be translated into the language of the hardware;
- Describe the interface between hardware and software, and explain how software instructs hardware to accomplish desired functions;
- Explain the process of carrying out sequential logic design;
- Explain computer arithmetic hardware blocks and floating-point representation;
- Explain how a hardware programming language is executed on hardware and how hardware and software design affect performance;
- Explain the factors that determine the performance of a program;
- Explain the techniques that designers use to improve the performance of programs running on hardware;
- Explain the importance of memory hierarchy in computer design, and explain how memory design impacts overall hardware performance;
- Describe storage and I/O devices, their performance measurement, and redundant array of inexpensive disks (RAID) technology; and
- Identify the reasons for and the consequences of the recent switch from sequential processing to parallel processing in hardware manufacture, and explain the basics of parallel programming.

Learn how to apply an engineering approach to computer software development by focusing on software principles, lifecycle models, requirements and specifications, architecture and conceptual model design, detailed design, implementation, validation and verification, quality assurance, configuration control, project management, tools, and environments.
Software engineering is a discipline that allows us to apply engineering and computer science concepts in developing and maintaining reliable, usable, and dependable software. The software engineering concept was discussed at Germany's 1968 NATO Science Committee meeting. In 1984, Carnegie Mellon University won a contract to establish a government research and development center to transition processes, methods, tools, and frameworks to address the challenges of software cost and quality in meeting customer needs. There are several areas to focus on within software engineering, such as design, development, testing, maintenance, and management. Software development outside the classroom is complex because real-world software is much larger, distributed worldwide, and faces cybersecurity threats.
This course aims to present software engineering as a body of knowledge. The course presents software engineering concepts and principles parallel to the software development life cycle. The course will begin by introducing software engineering, defining this body of knowledge, and discussing the main methodologies of software engineering. You will then learn about the Software Development Life Cycle (SDLC) framework and its major methodologies, followed by software modeling using the Unified Modeling Language (UML), a standardized general-purpose modeling language used to create visual models of object-oriented software. You will learn about the SDLC's major phases: analysis, design, coding/implementation, and testing. You will also learn about project management to deliver high-quality software that satisfies customer needs and stays within budget. By completing the course, you will master software engineering concepts, principles, and essential processes of the SDLC. Using UML, you will demonstrate this knowledge by creating artifacts for requirements gathering, analysis, and design phases.
Software Engineering is a highly process-oriented discipline, including many technical and management activities performed by computer hardware, software, or people. In general, a process is a description of the tasks to be performed to complete an activity. If a process is not enactable – that is, if it lacks the detail hardware, software, or humans need to perform the activity – it must have an associated procedure that describes "how" the tasks are enacted. Processes and procedures also specify "who" enacts them (the roles) and provide context information, such as "why", "when", and "where" the activities are performed. Lastly, this course uses several important paradigms. A paradigm is a perspective, pattern, or model that helps describe a discipline. Two important paradigms for this course are "life cycle", used to describe the development of a system, and "language", used to explain processes and procedures. We communicate via a language that has nouns and verbs. Nouns represent roles ("who" and "what") and places ("where"); verbs represent activities, processes, and procedures. To learn a language, we need to learn its terms, the relationships of those terms, and its grammar. The language paradigm is used in explaining object-oriented design, modeling languages, and teaching programming languages.
- Unit 1: Introduction to Software Engineering
- Unit 2: The Software Development Life Cycle
- Unit 3: Software Modeling
- Unit 4: Software Requirements Gathering
- Unit 5: Software Requirements Analysis
- Unit 6: Software Design
- Unit 7: Object-Oriented Implementations
- Unit 8: Software Testing
- Unit 9: Project Management
- Unit 10: Design Modification and Quality Control
- Describe the knowledge and skills necessary to practice software engineering and the professional issues that a software engineer might face;
- Use software engineering principles such as separation of concerns, abstraction, and incremental development to develop reliable software;
- Differentiate between software development processes and methods;
- Describe major activities and create key deliverables in a software development life cycle, such as use case, class, and sequence diagrams;
- Create UML diagrams for software analysis and design by using the object-oriented methodology;
- Use project management concepts to manage projects, people, and products; and
- Use software engineering concepts to construct quality software systems.

Examine how operating systems and design have evolved as changes in hardware and software led to contemporary operating systems. Topics include basic OS concepts, methods of OS design and construction, process coordination and management, and algorithms for CPU scheduling, memory, and general resource allocation.
This course will introduce you to modern operating systems. We will focus on UNIX-based operating systems, though we will also learn about alternative operating systems, including Windows. The course will begin with an overview of the structure of modern operating systems. Over the course of the subsequent units, we will discuss the history of modern computers, analyze in detail each of the major components of an operating system (from processes to threads), and explore more advanced topics in the field, including memory management and file input/output. The class will conclude with a discussion of various system-related security issues.
- Unit 1: Introduction to Operating Systems
- Unit 2: Processes and Threads
- Unit 3: Synchronization
- Unit 4: CPU Scheduling
- Unit 5: Deadlock
- Unit 6: Memory Management
- Unit 7: File System
- Unit 8: Security
- Unit 9: Networking
- Understand the functions, structures, and evolution of operating systems;
- Evaluate design choices and trade-offs in the implementation of different operating systems, and decide which type of operating system would be most appropriate for a given situation;
- Evaluate, debug, and modify operating system code to achieve the desired functionality; and
- Explain data structures, algorithms, computer architecture, and programming in the context of operating systems.

Explore the hardware, software, and architectural components involved in computer communications in local area networks by reviewing the basics of computer networks, switching, routing, protocols, and security.
The Internet has become one of the most important components of our lives. We browse the Web, check e-mails, make VoIP phone calls, and have video conferences via computers. These applications are made possible by networking computers together, and this complex network of computers is usually referred to as the Internet. This course is designed to give you a clear understanding of how networks, from in-home local area networks (LANs) to the massive and global Internet, are built and how they allow us to use computers to share information and communicate.
Unit 1 introduces computer networks and some of the basic terminology fundamental to understanding them. You will also familiarize yourself with the concept of layers, which compose the framework around which networks are built. Next, Unit 2 explains the concept of protocols. A computer communication (or network) protocol defines the rules and conventions for communication between network devices.
The rest of the course implements a top-down approach to teach you the details about each layer and the relevant protocols used in computer networks. Beginning in Unit 3, you will explore the concept of application layer protocols, which include the Domain Name System, e-mail protocols, and the Hypertext Transfer Protocol. Unit 3 ends with an overview of how to use socket programming to develop network applications. In Unit 4, you will learn transport layer protocols, including the Transmission Control Protocol (TCP) and the User Datagram Protocol (UDP). You will go on to study the network layer Internet Protocol (IP) and packet routing protocols in Unit 5. Next is Unit 6, devoted to a discussion on link layer protocols, and the course concludes with an overview of voice and video protocols, network security, and cloud computing in Unit 7. As you move through the course, notice how the layers build on top of one another and work together to create the amazing tool of computer networks, which many of us depend upon daily.
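As a preview of the socket programming introduced at the end of Unit 3, here is a minimal TCP echo sketch in Python (the host and port are arbitrary choices for a local demo):

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007  # arbitrary local address for this demo

srv = socket.create_server((HOST, PORT))  # bind and listen (TCP)

def echo_once():
    conn, _ = srv.accept()             # wait for one client
    with conn:
        conn.sendall(conn.recv(1024))  # echo the bytes back

threading.Thread(target=echo_once, daemon=True).start()

# Client side: open a TCP connection, send a message, read the echo.
with socket.create_connection((HOST, PORT)) as client:
    client.sendall(b"hello, network")
    print(client.recv(1024))           # b'hello, network'

srv.close()
```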
- Unit 1: Networking Fundamentals
- Unit 2: The Basics of Protocols
- Unit 3: The Application Layer
- Unit 4: The Transport Layer (TCP/UDP)
- Unit 5: The Network Layer
- Unit 6: The Link Layer
- Unit 7: Multimedia, Security, and Cloud Computation over the Internet
- Describe the architecture of a computer network and how devices in a network communicate with each other;
- Compare the basic network protocols in each layer of a transmission control protocol/Internet protocol (TCP/IP) stack with its counterpart Open Systems Interconnection (OSI) layer;
- Configure Internet Protocol (IP) addresses and use them in complex computer networks;
- Illustrate the use of subnetting and supernetting to divide a large network into smaller logical subnetworks (see the sketch after this list);
- Compare the different link layer access mechanisms, protocols, and technologies;
- Compare application layer protocols; and
- Apply network techniques to create wide-area networks.
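As a sketch of the subnetting outcome above, Python's standard ipaddress module makes the address arithmetic easy to experiment with (the addresses are arbitrary examples):

```python
import ipaddress

net = ipaddress.ip_network("192.168.0.0/24")

# Subnetting: split one /24 into two /25 logical subnetworks.
for subnet in net.subnets(prefixlen_diff=1):
    print(subnet, "->", subnet.num_addresses, "addresses")

# Supernetting: merge back up into the containing /23.
print(net.supernet(prefixlen_diff=1))  # 192.168.0.0/23
```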

Learn about database architecture and implementation by exploring Structured Query Language (SQL), including topics like file structures and access methods; database modeling, design, and user interface; the components of database management systems; and information storage and retrieval.
Though we may not recognize them in our everyday activities, databases are everywhere. They are hidden behind your online banking profile, airline reservation systems, medical records, and even employment records. This course will provide a general overview of databases, introducing you to database history, modern database systems, the different models used to design a database, and Structured Query Language (SQL), the standard language used to access and manipulate databases. Many of the principles of database systems carry over to other areas in computer science, especially operating systems. Databases are often considered one of the core computer science topics, since many other areas in the discipline derive from this one.
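For a first taste of SQL (run here through Python's built-in sqlite3 module, with an invented two-table schema), the kind of select and join statements covered in Units 9 and 10 look like this:

```python
import sqlite3

con = sqlite3.connect(":memory:")  # throwaway in-memory database
con.executescript("""
    CREATE TABLE department (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT,
                           dept_id INTEGER REFERENCES department(id));
    INSERT INTO department VALUES (1, 'Sales'), (2, 'Engineering');
    INSERT INTO employee VALUES (1, 'Ada', 2), (2, 'Grace', 2), (3, 'Alan', 1);
""")

# A join: list each employee alongside their department's name.
for row in con.execute("""
        SELECT employee.name, department.name
        FROM employee
        JOIN department ON employee.dept_id = department.id
        ORDER BY employee.name"""):
    print(row)
```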
- Unit 1: Introduction to Modern Database Systems
- Unit 2: Database Architecture and Data Languages
- Unit 3: Database History
- Unit 4: The Entity-Relationship Model
- Unit 5: The Relational Database Model
- Unit 6: Relational Algebra
- Unit 7: Introduction to Data Normalization
- Unit 8: Introduction to SQL
- Unit 9: Basic Select Statements
- Unit 10: The Join Statement
- Draw a system diagram of a database management system showing its structure and functions;
- Identify the various people involved in database management systems;
- Explain the historical background of database management systems (DBMSes) and relate early DBMS problems and challenges to the current state of DBMS technology;
- Demonstrate the functions of a database management system;
- Develop an entity-relationship model based on user requirements;
- Perform the process of normalization;
- Convert an entity-relationship diagram to a set of normalized relations;
- Explain referential integrity and give an example of relations where it is not satisfied; and
- Use relational algebra to construct queries.

Learn the principles of information security to protect the confidentiality, integrity, and availability of information. Discuss the modes of threats and attacks on information systems, threat mitigation, cryptography, user identification and authentication, access control, privacy laws, and more.
The first network was invented in the late 1960s with the birth of ARPAnet, a project launched by the US Department of Defense (DoD). That network evolved into what is now known as the Internet, a global phenomenon that has become an integral part of our daily lives. The Internet connects the world on a social, business, and governmental level. So much information is stored and transferred online that the Internet has become a target for criminals. Any device connected to the Internet must be protected from unauthorized disclosure using tools prescribed by the discipline of information security.
This course covers information security principles, an area of study that engages in protecting the confidentiality, integrity, and availability of information. Information security continues to grow with advancements in technology – as technology advances, so do threats, attacks, and our efforts to mitigate them. In this course, we discuss the modes of threats and attacks on information systems. We also discuss an important area of threat mitigation that saw rapid development in the twentieth century: cryptography. Information security is concerned with user identification and authentication, and access control based on individual or group privileges. The basic access control models and the fundamentals of identification and authentication methods are included in this course.
Without networks, our focus would primarily be on controlling unauthorized physical access. Instead, networks are the way we keep data in motion, making information security a more complex task. We discuss methods to design secure networks using firewalls, tunneling, and encryption, and we describe some tools to secure networks, such as honeypots, network sniffers, and packet capturing. Operating systems that connect to a network must be hardened to prevent unauthorized disclosure. Methods and tools such as patching, logging, antivirus, and antimalware tools are discussed.
The last topic in this course is global privacy laws. When unauthorized disclosure or a breach of information occurs, there are adverse effects and penalties placed on individuals or organizations, depending on the area of jurisdiction. Laws are diverse and vary greatly throughout the world, and we are still trying to develop laws that will protect privacy globally.
In this course, you will learn the fundamentals of information security, security threats, modes of attack, and cryptographic models. Access control, identification, and authentication are also addressed. Network security and operating system (OS) hardening are explained along with intrusion detection and prevention. The course concludes with global privacy laws.
- Unit 1: Introduction to Information Security
- Unit 2: Threats and Attack Modes
- Unit 3: Cryptographic Models
- Unit 4: Access Control
- Unit 5: Identification and Authentication
- Unit 6: Network Security
- Unit 7: Operating System (OS) Security
- Unit 8: Intrusion Detection and Prevention Systems
- Unit 9: Privacy Laws, Penalties, and Privacy Issues
- Explain the fundamental principles of information security;
- Identify major information security threats and their modes of attack;
- Describe cryptographic models and how they are used to provide security;
- Explain the principles of access control models such as discretionary, mandatory, role, and rule-based;
- Illustrate methods of identification and authentication such as passwords, PINs, biometrics, and tokens;
- Describe network security methods including network designs, firewalls, wireless encryption methods, tunneling, and network protection tools;
- Describe operating system (OS) hardening, malware protection, firewalls, and security tools;
- Explain intrusion detection systems (IDS) and intrusion prevention systems (IPS), and the advantages and disadvantages of each system; and
- Synthesize current privacy laws and the implications of violations on organizations.