This document provides a detailed overview of fundamental concepts in computer science related to algorithms, their analysis, and specific design strategies like "Divide and Conquer."
Algorithm Fundamentals
- Definition: An algorithm is a finite sequence of instructions with clear meanings that can be performed with finite effort and time, always terminating after a finite number of steps.
- Criteria: Every algorithm must satisfy five key criteria:
- Input: Zero or more externally supplied quantities.
- Output: At least one quantity is produced.
- Definiteness: Each instruction must be clear and unambiguous.
- Finiteness: The algorithm must terminate after a finite number of steps for all cases.
- Effectiveness: Instructions must be basic enough to be feasible, even if performed by a person with only pencil and paper.
- Programs vs. Algorithms: Unlike algorithms, programs do not necessarily have to terminate; for instance, an operating system continues in a wait loop until more jobs are entered.
- Representation: Algorithms can be described using natural language (e.g., English), graphic representations (flowcharts), or pseudo-code.
Pseudo-Code Conventions
The document outlines specific conventions for writing pseudo-code:
- Structure: Comments start with //, and blocks are indicated by matching braces { }.
- Identifiers and Data Types: Identifiers begin with a letter, and while simple variables aren't explicitly declared, compound data types can be formed using records.
- Statements: Assignment uses :=, and logical operators include and, or, and not.
- Flow Control: Common statements include looping (for, while, and repeat-until) and conditional statements (if-then-else, case).
- Input/Output: Handled via read and write instructions.
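To illustrate, the conventions above can be combined into a short procedure. The algorithm shown (a simple maximum finder) is a hypothetical example written in this pseudo-code style, not one taken from the document:

```
Algorithm Max(a, n)
// a is an array of size n; returns the largest element.
{
    result := a[1];
    for i := 2 to n do
    {
        if (a[i] > result) then
            result := a[i];
    }
    return result;
}
```

Note how the comment uses //, the block is delimited by matching braces, and assignment is written with := rather than =.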
Performance Analysis
Performance is measured by the amount of computer memory (space complexity) and computer time (time complexity) needed for a program to run.
- Space Complexity: Includes instruction space (to store the compiled instructions), data space (for constants, variables, and dynamically allocated objects), and environment stack space (to resume execution of partially completed functions).
- Time Complexity: The amount of computer time needed to run to completion, often expressed as a function of the problem's size.
- Asymptotic Notations: Used to characterize complexity:
- Big-OH (O): Represents an upper bound (growth rate $\le$ that of another function).
- Big-OMEGA ($\Omega$): Represents a lower bound (growth rate $\ge$ that of another function).
- Big-THETA ($\Theta$): Represents the same order (growth rate bounded above and below by constant multiples of another function).
- Little-oh (o): Indicates a function that becomes insignificant relative to another as the input size approaches infinity.
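As a small worked illustration of the Big-OH definition: for $f(n) = 3n^2 + 2n$ one can pick the witnesses $c = 5$ and $n_0 = 1$ and verify that $f(n) \le c \cdot n^2$ for all $n \ge n_0$, so $f(n) = O(n^2)$. The function and constants here are illustrative choices, not taken from the document; the check below is a finite spot-check of the inequality, not a proof:

```python
def f(n):
    # Example running-time function: 3n^2 + 2n (an assumed illustration).
    return 3 * n * n + 2 * n

def is_bounded(c, n0, upper=10_000):
    # Spot-check the Big-OH condition f(n) <= c * n^2 for n0 <= n < upper.
    return all(f(n) <= c * n * n for n in range(n0, upper))

print(is_bounded(c=5, n0=1))  # True: these witnesses satisfy the definition
print(is_bounded(c=3, n0=1))  # False: c=3 is too small, since 3n^2 + 2n > 3n^2
```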
Classification of Algorithms by Time Complexity
The document classifies algorithms based on their growth rates as the input size ($n$) increases:
- Constant $O(1)$: Instructions execute a fixed number of times regardless of $n$.
- Logarithmic $O(\log n)$: Programs get slightly slower as $n$ grows; common when a problem is repeatedly cut by a constant fraction.
- Linear $O(n)$: A small amount of processing is done for each input element.
- Quadratic $O(n^2)$: Practical only for small problems, often involving double nested loops.
- Exponential $O(2^n)$: Naturally arises as "brute-force" solutions and is rarely appropriate for practical use.
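These growth rates can be made concrete by counting loop iterations. The following functions are hypothetical illustrations (not code from the document) of the linear, quadratic, and logarithmic patterns described above:

```python
def linear_count(n):
    # One unit of work per input element: O(n).
    count = 0
    for _ in range(n):
        count += 1
    return count

def quadratic_count(n):
    # Double nested loop: O(n^2).
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1
    return count

def logarithmic_count(n):
    # Repeatedly cutting the problem in half: O(log n).
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count

print(linear_count(1000))       # 1000
print(quadratic_count(1000))    # 1000000
print(logarithmic_count(1000))  # 9, roughly log2(1000)
```

Doubling $n$ doubles the linear count, quadruples the quadratic count, but adds only one step to the logarithmic count.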
Divide and Conquer Strategy
This design strategy improves efficiency by breaking a problem into smaller instances of the same problem, solving them recursively, and then assembling those solutions. Examples include:
- Binary Search: Efficiently finds an element in an ordered list by repeatedly probing the middle element and eliminating half of the remaining items. Its worst-case complexity is $O(\log n)$.
- Merge Sort: Recursively sorts left and right halves of an array and merges them. It has a time complexity of $O(n \log n)$ but requires extra linear memory for merging.
- Strassen’s Matrix Multiplication: A more efficient alternative to standard matrix multiplication, achieving a complexity of $O(n^{2.81})$ instead of $O(n^3)$ by using seven multiplications of $(n/2) \times (n/2)$ submatrices instead of eight.
- Quick Sort: Partitions an array into two groups based on a pivot element (those less than the pivot and those greater) and recursively sorts the sub-arrays. Its average performance is $O(n \log n)$, though its worst-case is $O(n^2)$.
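Two of the techniques above can be sketched in Python. This is a minimal illustration of the divide-and-conquer pattern under the descriptions given here, with function names chosen for this sketch rather than taken from the document:

```python
def binary_search(a, target):
    """Return an index of target in sorted list a, or -1: O(log n) worst case."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2      # jump to the middle ...
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1          # ... and eliminate the left half
        else:
            hi = mid - 1          # ... or eliminate the right half
    return -1

def merge_sort(a):
    """Sort a list in O(n log n) time, using O(n) extra memory for merging."""
    if len(a) <= 1:               # base case: already sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # divide: sort each half recursively
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0       # combine: merge the two sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 7]))        # [1, 2, 5, 7, 9]
print(binary_search([1, 2, 5, 7, 9], 7))  # 3
```

Note the shared shape: each call halves the problem, and the work per level (constant for binary search, linear for the merge) determines the overall complexity.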