
CSD-1133: Problem Solving & Program Logic Lesson Plan


High-Level Summary of this Course:

Welcome, Veerpal! Here's a roadmap of how we’ll progress from the basics of computer science (bits and bytes) to building advanced applications like an AI model, a SOAP microservices web application using Node.js, and Python-based Excel data analytics.
The journey is designed to build your interest and skills and get you ready for the job interview!

Step 1: Foundations - Bits, Bytes, and Building Blocks

What You’ll Learn:
The basics of computation: bits, bytes, logic gates, and how they form the foundation of digital computers.
Hands-on projects like building a 2-bit adder to understand how arithmetic operations work at the hardware level.
Why It Matters:
This gives you a concrete understanding of how computers think and calculate.
You'll see how this scales to more complex operations in CPUs and modern programming languages.

Step 2: Programming and Problem Solving

What You’ll Learn:
Core programming constructs (variables, loops, functions) in Python and JavaScript.
Problem-solving techniques and flowcharting to break down tasks systematically.
Hands-on labs with small projects to reinforce programming concepts.
Why It Matters:
These skills are essential for working with real-world data and applications.
Python’s simplicity makes it ideal for data analysis, while JavaScript introduces you to web-based programming.

Step 3: Applied Systems - Building Hardware and Software Interactions

What You’ll Learn:
Control simple systems like Arduino or Raspberry Pi robot cars using JavaScript.
Explore how software translates into hardware actions (e.g., moving a robot, reading sensors).
Why It Matters:
Connecting programming to physical systems shows you the practical side of computing.
These projects demonstrate how abstract code can control tangible devices, bridging theory and reality.

Step 4: AI and Data Analytics with Python

What You’ll Learn:
Understand the basics of artificial intelligence and machine learning models.
Use Python libraries like Pandas and NumPy to analyze Excel data (see the short sketch at the end of this step).
Build small AI models to perform tasks like predictions or classifications.
Why It Matters:
Python is the go-to language for AI and data analytics, skills that are in high demand in today’s job market.
You'll gain insights into how data drives decisions in businesses and technology.
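To preview what this looks like in practice, here is a minimal Pandas sketch. The file name and column names ("sales.xlsx", "month", "revenue") are hypothetical placeholders, not course materials:

```python
# Minimal sketch of Excel data analysis with Pandas.
# "sales.xlsx", "month", and "revenue" are hypothetical placeholders.
import pandas as pd

df = pd.read_excel("sales.xlsx")             # load a worksheet into a DataFrame
print(df.describe())                         # summary statistics for numeric columns
print(df.groupby("month")["revenue"].sum())  # total revenue per month
```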

Step 5: Web Application Development

What You’ll Learn:
Develop a SOAP microservices web application using Node.js.
Understand how web services communicate and how to process requests/responses.
Learn about APIs and how they connect different software systems.
Combine Python, AI, and Excel data analytics in a web context.
The culmination of our journey will be to build your own AI model and host it online.
Why It Matters:
Web services power most modern applications, from e-commerce sites to cloud platforms.
Node.js gives you hands-on experience in creating scalable, server-side applications.

How These Steps Fit Together

From Bits to Applications:
Starting with logic gates and adders teaches you the physical foundation of computation.
Progressing through programming and hardware control bridges the gap between basic computation and real-world applications.
AI and web applications demonstrate the advanced capabilities of modern computing, all built on the principles you’ll learn.

Your Journey

This personalized course is designed to be engaging, practical, and comprehensive, ensuring you gain confidence and expertise.
By the end, you’ll:
Understand the fundamentals of how computers operate.
Be skilled in programming and data analysis with Python.
Develop real-world web applications with Node.js.
Appreciate the elegance of computing and be equipped to tackle challenges in both technology and problem-solving.
Let’s make this journey both exciting and successful! 🚀
This course, "Problem Solving & Program Logic," embarks on a historical journey through the milestones of computer development, providing a rich foundation for understanding modern programming concepts.
Students will explore the evolution of computers by engaging in creative simulations and DIY projects that bring history to life.
From simulating the ENIAC, one of the earliest computers, using transistors on a whiteboard, to leveraging Raspberry Pi to "go to the metal" and gain insights into low-level programming, the course is designed to provide hands-on experiences that bridge historical developments with contemporary applications.
Students will have the opportunity to recreate and experiment with the milestones of computer history, such as early operating systems and language interpreters.
This approach not only enriches the learning experience by providing context to algorithm and logic development but also enhances problem-solving skills by encouraging a deeper understanding of how these foundational technologies have influenced modern programming paradigms.
Through these activities, the course aims to cultivate an appreciation for the iterative nature of technological advancement and inspire innovative thinking within the realm of program logic and design.

This curriculum follows a case study approach by:
Using historical context as a framework (vacuum tube to modern AI)
Focusing on language-independent concepts
Incorporating weekly case studies
Building progressively from basic to advanced concepts
Emphasizing problem-solving and program logic
Using pseudocode and flowcharting as primary tools
### Course Overview

This language-independent course introduces fundamental programming concepts through a historical lens, using the evolution of computers as a framework. Students will learn problem-solving and program logic through hands-on exercises, pseudocode, and flowcharting, building from basic computing concepts to modern implementations.
### Learning Objectives

By the end of this course, students will be able to:
1. Design, test, and debug programs using a top-down modular approach
2. Control program flow using decision and repetitive structures
3. Implement array processing techniques
4. Develop programs that process data from files
5. Create software solutions using pseudocode and flowchart tools
6. Apply structured programming techniques to solve problems
### Weekly Breakdown
#### Week 1: Introduction to Problem Solving and Computing History
- Course introduction and overview
- Historical context: Vacuum tube computers
- Problem-solving methodology
- Introduction to pseudocode and flowcharting
- Lab: Creating basic flowcharts for simple problems
- Case Study: ENIAC and early problem-solving approaches

#### Week 2: Basic Program Structure and Logic
- Program structure fundamentals
- Variables and data types
- Basic input/output operations
- Sequence structure
- Lab: Writing pseudocode for linear programs
- Case Study: Evolution from vacuum tubes to transistors

#### Week 3: Decision Structures
- Boolean logic and conditions
- IF-THEN-ELSE structures
- CASE/SWITCH structures
- Lab: Flowcharting decision structures
- Case Study: Early decision-making in computer programs

#### Week 4: Repetitive Structures I
- Introduction to loops
- While loops
- Do-While loops
- Lab: Creating flowcharts and pseudocode for iterative processes
- Case Study: Early batch processing systems

#### Week 5: Repetitive Structures II
- For loops
- Nested loops
- Loop control statements
- Lab: Solving problems using multiple loop types
- Case Study: Evolution of program control structures

#### Week 6: Arrays and Data Structures I
- Introduction to arrays
- One-dimensional arrays
- Array operations
- Lab: Array manipulation exercises
- Case Study: Early data storage solutions

#### Week 7: Arrays and Data Structures II
- Multi-dimensional arrays
- Array searching
- Array sorting
- Lab: Implementing basic sorting algorithms
- Case Study: Development of memory systems

#### Week 8: Midterm Project
- Comprehensive problem-solving project
- Implementation of multiple concepts
- Documentation requirements
- Project presentation
- Case Study: Integration of concepts through historical perspective

#### Week 9: File Processing I
- Introduction to file operations
- Sequential file access
- Error handling
- Lab: Basic file operations
- Case Study: Evolution of storage systems

#### Week 10: File Processing II
- Random access files
- File update operations
- Data validation
- Lab: Advanced file handling
- Case Study: Modern storage solutions

#### Week 11: Functions and Modular Programming I
- Function concepts
- Parameter passing
- Return values
- Lab: Creating and using functions
- Case Study: Development of modular programming

#### Week 12: Functions and Modular Programming II
- Function libraries
- Scope and lifetime
- Program organization
- Lab: Building a function library
- Case Study: Modern software development practices

#### Week 13: Advanced Problem-Solving Techniques
- Problem decomposition
- Algorithm efficiency
- Documentation standards
- Lab: Complex problem-solving exercises
- Case Study: Modern AI applications

#### Week 14: Final Project and Course Wrap-up
- Final project implementation
- Course review
- Modern programming concepts
- Future trends discussion
- Case Study: From vacuum tubes to modern AI
### Assessment Structure
- Weekly Labs: 30%
- Midterm Project: 25%
- Final Project: 35%
- Participation: 10%

### Teaching Methodology
- Historical case studies each week
- Hands-on labs with pseudocode and flowcharting
- Progressive building of concepts
- Language-independent approach
- Focus on problem-solving techniques
- Integration of historical context with modern applications

### Required Materials
- Flowcharting software
- Pseudocode development environment
- Case study materials (provided)
- Lab worksheets (provided)

### Additional Resources
- Historical computing references
- Modern programming examples
- Online problem-solving tools
- Supplementary practice exercises


"Circuitry & Code: A Journey Through Time in the Digital Revolution"

Embark on an engaging and enlightening journey through the history of computational hardware, software, and thought—beginning in the 1930s and traversing to the present day.
This narrative is a travelogue, inviting readers to explore pivotal moments in technology development as if they were traveling through time.
1. Start in the 1930s, introducing readers to the earliest mechanical computers, such as the Zuse Z1 and the pioneering ideas of Alan Turing.
2. Move through the wartime era, highlighting the influence of machines like the ENIAC and Colossus, which laid the groundwork for electronic computation.
3. Guide readers through the 1950s and 1960s, focusing on the transition from vacuum tubes to transistors and the creation of foundational programming languages like FORTRAN and COBOL. Illustrate the impact of these innovations on scientific and business computing.
4. Journey into the 1970s and 1980s, a time of significant innovation with the development of the microprocessor, personal computing, and software evolution, exemplified by milestones such as the IBM PC and the Apple Macintosh.
5. Explore the 1990s, with its explosive growth of the Internet and the initiation of open-source software movements, detailing how these advancements democratized technology.
6. Move into the early 2000s, chronicling the rise of mobile computing and cloud technologies, and emphasize the shifts in software design philosophies driven by these new platforms.
7. Conclude in the present day, examining the emergence of artificial intelligence, quantum computing, and the ongoing influence of computational thinking in modern society.
Throughout this travelogue, incorporate personal anecdotes from key figures, insights on how each technological advancement influenced society, and reflections on future possibilities. Use vivid descriptions and engaging storytelling to capture the spirit of discovery and innovation that defines the evolution of computation.

Learning Outcomes:
Introduce the concepts of information science through a simple lab teaching the most fundamental building blocks of information: constructing a series of simple logic gates and demonstrating binary addition using NAND gates, with push buttons as inputs and LEDs as outputs.

Here's a simple lab exercise for students to learn fundamental concepts of information science through the creation of logic gates and a basic adder circuit using NAND gates:

Lab Exercise: Building a Binary Adder with NAND Gates

Objective

Introduce students to the basic building blocks of digital circuits by constructing simple logic gates using NAND gates and combining them to create a binary adder.

Materials

Breadboard
NAND gate IC (e.g., 7400 series)
Push-button switches (2 per input)
LEDs (to represent output states)
Resistors (220Ω for LEDs, 10kΩ for pull-down resistors)
Jumper wires
Power supply (5V)

Background

Digital circuits are built using logic gates. The NAND gate is a universal gate, meaning all other gates can be built from it. In this lab, we will:
Construct basic logic gates (AND, OR, NOT) using only NAND gates.
Use these gates to build a 1-bit half adder.
Combine two half adders to create a 1-bit full adder.

Steps

Part 1: Constructing Basic Gates with NAND

NOT Gate:
Connect both inputs of a NAND gate to the same push-button.
Connect the output to an LED through a resistor.
Pressing the button should invert the signal (light off when pressed, on otherwise).
AND Gate:
Use two NAND gates. Connect the output of the first gate to both inputs of the second.
Input signals (push-buttons) go to the first NAND gate's inputs.
The output of the second gate is the AND operation.
OR Gate:
Combine three NAND gates.
Invert the inputs using two separate NAND gates (NOT gates).
Feed the inverted inputs into a third NAND gate.
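Before wiring anything, it can help to model these constructions in software. The following Python sketch mirrors Part 1: every gate is defined purely in terms of a nand() function, just as the circuits use only the IC's NAND gates. The function names are our own choices for this sketch:

```python
def nand(a, b):
    """NAND: output is 0 only when both inputs are 1."""
    return 0 if (a == 1 and b == 1) else 1

def not_(a):
    """NOT from one NAND: tie both inputs together."""
    return nand(a, a)

def and_(a, b):
    """AND from two NANDs: invert the NAND output."""
    return not_(nand(a, b))

def or_(a, b):
    """OR from three NANDs: invert each input, then NAND them."""
    return nand(not_(a), not_(b))

# Quick checks against the expected gate behaviour:
assert [not_(a) for a in (0, 1)] == [1, 0]
assert [and_(a, b) for a in (0, 1) for b in (0, 1)] == [0, 0, 0, 1]
assert [or_(a, b) for a in (0, 1) for b in (0, 1)] == [0, 1, 1, 1]
```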

Part 2: Building a Half Adder

A half adder adds two binary numbers (A and B) and outputs:
Sum (S): S = A ⊕ B (XOR operation)
Carry (C): C = A · B (AND operation)
Implement the XOR gate using four NAND gates:
XOR = (A · NOT B) + (NOT A · B)
Connect the AND gate (already built in Part 1) for the Carry output.
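The same software model extends to the half adder. This sketch builds XOR from four NAND gates (a standard construction that realizes the formula above) and pairs it with the AND gate for the carry, assuming the nand() and and_() helpers from the Part 1 sketch:

```python
def xor_(a, b):
    """XOR from four NANDs: (A NAND t) NAND (B NAND t), where t = A NAND B."""
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

def half_adder(a, b):
    """Return (sum, carry) for two one-bit inputs."""
    return xor_(a, b), and_(a, b)

# 1 + 1 = 10 in binary: sum 0, carry 1
assert half_adder(1, 1) == (0, 1)
```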

Part 3: Building a Full Adder

A full adder combines two binary numbers with an input carry:
Inputs: A, B, Carry-in (Cin)
Outputs: Sum, Carry-out (Cout)
Use two half adders:
First half adder: Combine A and B.
Second half adder: Combine the Sum from the first half adder with Cin.
Use an OR gate for the Carry-out:
Cout = (A · B) + (Sum1 · Cin)
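Chaining the pieces gives the full adder in software as well. This sketch composes two half adders and an OR gate exactly as wired above (assuming half_adder() and or_() from the earlier sketches), then verifies every input combination against ordinary arithmetic:

```python
def full_adder(a, b, cin):
    """Two half adders plus an OR gate, as in the wiring steps above."""
    s1, c1 = half_adder(a, b)      # first half adder: A + B
    s2, c2 = half_adder(s1, cin)   # second half adder: Sum1 + Cin
    return s2, or_(c1, c2)         # Cout = (A·B) + (Sum1·Cin)

# Exhaustive test: sum + 2*carry must equal a + b + cin
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, cout = full_adder(a, b, cin)
            assert s + 2 * cout == a + b + cin
```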

Testing

Connect push-buttons to the inputs (A, B, Cin).
Observe LEDs for the Sum and Carry outputs.
Test all combinations of inputs and verify against the expected results.

Questions for Students

What is the significance of NAND as a universal gate?
How does the logic circuit translate mathematical addition into digital operations?
What challenges did you encounter when wiring the gates, and how did you troubleshoot?

Evaluation

Successfully building and testing each logic gate: 30%
Constructing a working half adder: 30%
Constructing a working full adder: 30%
Participation and troubleshooting explanation: 10%
This hands-on lab will introduce students to essential principles of digital logic and computational thinking while fostering practical skills in circuit building.
Let’s create an additional lab instruction detailing how to build truth tables, connecting this exercise to the student's understanding of how it leads up to more sophisticated applications such as a word processor, so they will realize why CSD-1133 opens with this exercise.


Lab Extension: Creating and Understanding Truth Tables for Logic Circuits

Objective

Learn how to construct truth tables for the logic gates and circuits built in the previous lab. Understand how these fundamental building blocks form the foundation of complex applications like a word processor.

Materials

Pen and paper or a spreadsheet tool
The completed circuits (NOT, AND, OR gates, half adder, full adder)
Push-button inputs and LED outputs

Background

Truth tables systematically list all possible inputs and their corresponding outputs for a digital circuit. By visualizing how each input combination maps to an output, students can see the deterministic nature of digital logic. This principle scales up to create intricate systems like processors, which ultimately enable applications like a word processor.

Steps

Part 1: Constructing Truth Tables for Basic Gates

NOT Gate:
Inputs: A (0 or 1).
Output: NOT A.
Table format:

Table 1

| Input A | Output NOT A |
|---------|--------------|
| 0       | 1            |
| 1       | 0            |
AND Gate:
Inputs: A, B (00, 01, 10, 11).
Output: A · B.
Table format:

Table 2

| Input A | Input B | Output A · B |
|---------|---------|--------------|
| 0       | 0       | 0            |
| 0       | 1       | 0            |
| 1       | 0       | 0            |
| 1       | 1       | 1            |
OR Gate:
Inputs: A, B.
Output: A + B.
Table format:

Table 3

| Input A | Input B | Output A + B |
|---------|---------|--------------|
| 0       | 0       | 0            |
| 0       | 1       | 1            |
| 1       | 0       | 1            |
| 1       | 1       | 1            |

Part 2: Truth Tables for Half and Full Adders

Half Adder:
Inputs: A, B.
Outputs: S = A ⊕ B, C = A · B.
Table format:

Table 4

| Input A | Input B | Sum S | Carry C |
|---------|---------|-------|---------|
| 0       | 0       | 0     | 0       |
| 0       | 1       | 1     | 0       |
| 1       | 0       | 1     | 0       |
| 1       | 1       | 0     | 1       |
Full Adder:
Inputs: A, B, Cin (all combinations of three binary inputs).
Outputs: S = (A ⊕ B) ⊕ Cin, Cout = (A · B) + (Sum1 · Cin).
Table format:

Table 5

| A | B | Cin | Sum S | Carry Cout |
|---|---|-----|-------|------------|
| 0 | 0 | 0   | 0     | 0          |
| 0 | 0 | 1   | 1     | 0          |
| 0 | 1 | 0   | 1     | 0          |
| 0 | 1 | 1   | 0     | 1          |
| 1 | 0 | 0   | 1     | 0          |
| 1 | 0 | 1   | 0     | 1          |
| 1 | 1 | 0   | 0     | 1          |
| 1 | 1 | 1   | 1     | 1          |
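Truth tables like these can also be generated programmatically, which is a handy way to check your hand-written tables. A minimal sketch, assuming the half_adder() and full_adder() functions from the first lab:

```python
from itertools import product

def truth_table(fn, n_inputs):
    """Print one row per input combination, with the circuit's output(s)."""
    for bits in product((0, 1), repeat=n_inputs):
        print(*bits, "->", fn(*bits))

truth_table(half_adder, 2)   # four rows: (sum, carry), matching Table 4
truth_table(full_adder, 3)   # eight rows: (sum, carry-out), matching Table 5
```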

Part 3: Linking Logic to Applications

Discuss how the adder circuit is a fundamental part of the Arithmetic Logic Unit (ALU) in a CPU:
Adders in CPUs:
The binary addition demonstrated here is used in arithmetic operations, decision-making, and memory addressing in processors.
These simple circuits are scaled up and integrated to handle more bits and operations (see the ripple-carry sketch after this list).
Word Processors and Software:
A word processor relies on the CPU to perform tasks like text manipulation and display rendering.
The CPU processes binary data through billions of these basic logic operations per second.
Logic gates enable all computation, including loading data from memory, rendering graphics, and even interpreting user input.
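To make the scaling claim concrete, here is a sketch of a 4-bit ripple-carry adder: four copies of the full adder from the first lab, with each stage's carry-out feeding the next stage's carry-in. This is a simplified model; real ALUs use faster carry schemes, but the principle is the same:

```python
def ripple_add(a_bits, b_bits):
    """Add two equal-length, least-significant-bit-first bit lists."""
    carry, out = 0, []
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)  # carry ripples to the next stage
        out.append(s)
    return out, carry

# 6 + 7 = 13: bit lists are least-significant bit first
bits, carry = ripple_add([0, 1, 1, 0], [1, 1, 1, 0])
assert (bits, carry) == ([1, 0, 1, 1], 0)   # 1101 in binary = 13
```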

Questions for Students

How does a truth table help in verifying the correctness of a circuit?
Describe the relationship between simple circuits and complex software like a word processor.
Can you hypothesize how memory and data storage would rely on logic gates?

Evaluation

Correctly completing truth tables for gates: 20%
Correctly completing truth tables for adders: 40%
Written reflection connecting circuits to word processors: 40%
By completing this lab, students will develop a deeper understanding of how the most basic digital operations scale up to enable modern computing applications.

Learning outcomes:

Let’s open students’ eyes to what a miraculous piece of good fortune it is that we live in a world in which information can be digitized and processed using algorithms, all of which build up from the NAND, NOR, and NOT gates that CPUs are composed of.


Lab Book Section: The Miracle of Digital Information and Logic Gates

The Marvel of Digitized Information

Take a moment to reflect on the world we live in. Every time you type on a computer, send a message, play a video game, or search the internet, you are participating in one of humanity's greatest achievements: the ability to process information digitally.
This achievement isn't just a technological marvel—it's a profound testament to the power of abstraction, logic, and creativity. And at the heart of it all lies something deceptively simple: the humble logic gate.

The Universe of Logic

What is Information?
At its core, information is data that has meaning or value. In the digital world, this data is encoded as sequences of 1s and 0s, known as binary.
Why binary? Because any physical system—whether it's electrical circuits, light pulses, or magnetic storage—can reliably represent two states (on/off, high/low, 1/0).
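As a tiny Python illustration of this idea, ordinary text is already a sequence of bit patterns (using the standard ord and format built-ins):

```python
# Every character of text is stored as a pattern of 1s and 0s.
text = "Hi"
bits = " ".join(format(ord(ch), "08b") for ch in text)
print(bits)  # 01001000 01101001
```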
Logic Gates: The Building Blocks
A logic gate is a tiny electronic circuit that takes one or more binary inputs and produces a single binary output based on a simple rule.
With just a few types of gates—NAND, NOR, and NOT—we can build any digital system imaginable.
Universal Gates
NAND and NOR are called "universal gates" because they can be combined to recreate all other logic gates. This means that an entire CPU—capable of running millions of apps, games, and websites—can theoretically be built using only NAND gates.

From Logic to Miracles

How Does This Scale?
Logic gates are grouped to form adders, which perform basic arithmetic. Adders are scaled up to create an Arithmetic Logic Unit (ALU), the core of a CPU.
CPUs then interact with memory to store and retrieve information, graphics processors to render images, and input/output systems to communicate with the outside world.
The Algorithmic Revolution
Once information is digitized, algorithms—step-by-step procedures for solving problems—can process, sort, search, and analyze it at speeds far beyond human capability.
Everything from Google search to video streaming and AI relies on algorithms built upon the foundational principles of logic gates.
Applications Everywhere
Word processors translate keypresses into displayed characters using binary representations and CPU instructions.
Video calls compress, transmit, and decode audiovisual data—all using mathematical transformations implemented via logic gates.
Scientific discoveries, weather forecasting, and medical imaging are driven by computations unimaginable without the ability to digitize and process data.

Why This Matters

Imagine a world where this wasn’t possible—where every calculation had to be done manually or communicated through analog signals prone to noise and error. Humanity might still be struggling with tasks we now take for granted. The realization that binary logic and physical systems could work together to represent and process information has propelled society into the information age, transforming every aspect of our lives.

Reflection Activity: The Bigger Picture

Thought Experiment:
What if logic gates didn’t work? Could humanity have achieved the digital age some other way?
How does understanding the simplicity of logic gates deepen your appreciation for the complexity of modern technology?
Challenge:
Consider the device you're reading this on. Imagine tracing every operation—from displaying this text to processing your thoughts as you type a response—back to the NAND gates in its CPU. How many layers of abstraction might you encounter?

Key Takeaway

The next time you use your computer or smartphone, remember that every operation—no matter how advanced—depends on the simplest truths of binary logic, flowing through billions of NAND, NOR, and NOT gates. It's not just technology; it's a miracle of human ingenuity and the nature of the universe itself.

Learning Outcomes:

Describe how real-world CPU chips contain embedded microcode, which enables von Neumann computers to run software such as language compilers.

Embedded Microcode in Real-World CPUs: Bridging Hardware and Software

Modern CPUs are marvels of engineering that allow us to execute complex software applications.
To understand how they achieve this, let’s delve into the concept of embedded microcode and how it enables von Neumann computers to run software like language compilers.
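One way to build intuition for this is a toy von Neumann machine in Python. The sketch below is purely illustrative: the instruction set, opcodes, and program are invented for this example and do not correspond to any real CPU's microcode. What it shows is the essential loop that microcode implements in hardware: fetch an instruction from the same memory that holds data, decode it, and execute it.

```python
# Toy von Neumann machine: program and data share one memory.
# Opcodes are invented for illustration, stored as (op, operand) pairs.
memory = [
    ("LOAD", 6),    # 0: acc = memory[6]
    ("ADD", 7),     # 1: acc += memory[7]
    ("STORE", 8),   # 2: memory[8] = acc
    ("PRINT", 8),   # 3: print memory[8]
    ("HALT", 0),    # 4: stop
    None,           # 5: (unused)
    2,              # 6: data
    3,              # 7: data
    0,              # 8: result goes here
]

acc, pc = 0, 0
while True:
    op, arg = memory[pc]          # fetch and decode
    pc += 1
    if op == "LOAD":
        acc = memory[arg]
    elif op == "ADD":
        acc += memory[arg]
    elif op == "STORE":
        memory[arg] = acc
    elif op == "PRINT":
        print(memory[arg])        # prints 5
    elif op == "HALT":
        break
```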