TheCollinsByte/DSA

Java implementations of essential Data Structures and Algorithms for learning and practice.

Description

DSA is an open-source project where I implement essential data structures and algorithms and solve LeetCode challenges in Java. The aim is to provide efficient, well-documented solutions to a variety of coding problems.

Installation

Instructions on how to set up the project on a local machine:

  1. Clone the repository:

    git clone https://github.com/TheCollinsByte/DSA
  2. Navigate to the project directory:

    cd DSA
  3. Ensure you have JDK 21 installed.

  4. Compile the project:

    ./gradlew build
  5. Run the tests for a module:

    ./gradlew :{MODULE_NAME}:test -Dtest.verbose=true

Data Structures

A data structure is a mathematical or logical model for organizing data.

Classification of Data Structure

  • Linear Data Structure: a data structure in which data elements are arranged sequentially, with each element attached to its previous and next adjacent elements. Examples: Array, Stack, Queue, Linked List.
  • Static Data Structure: has a fixed memory size, which makes accessing its elements straightforward. Example: Array.
  • Dynamic Data Structure: the size is not fixed; it can grow and shrink at runtime, which can be more efficient in terms of memory (space) usage. Examples: Queue, Stack (see the sketch after this list).
  • Non-Linear Data Structure: data elements are not placed sequentially, so all elements cannot be traversed in a single pass. Examples: Trees and Graphs.
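
A minimal sketch of the static vs. dynamic distinction in Java (the class name and values are illustrative, not taken from this repository):

    import java.util.ArrayList;
    import java.util.List;

    public class StaticVsDynamic {
        public static void main(String[] args) {
            // Static: a plain array has a fixed size chosen up front.
            int[] fixed = new int[3];
            fixed[0] = 10;
            fixed[1] = 20;
            fixed[2] = 30;
            // fixed[3] = 40; // would throw ArrayIndexOutOfBoundsException

            // Dynamic: an ArrayList grows as elements are added at runtime.
            List<Integer> growable = new ArrayList<>();
            growable.add(10);
            growable.add(20);
            growable.add(30);
            growable.add(40); // no fixed capacity to manage by hand

            System.out.println(fixed.length + " vs " + growable.size()); // prints "3 vs 4"
        }
    }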

Types of Data Structures

  • Arrays: an array is a collection of elements of the same type placed in contiguous memory locations.
  • Linked List: a linear data structure in which elements are not stored at contiguous memory locations; instead, each element is linked to the next.
  • Stacks: follow the LIFO (Last In, First Out) principle: the last element pushed onto the stack is the first one removed.
  • Queues: follow the FIFO (First In, First Out) principle: the first element stored is the first one removed.
  • Hash Tables (Hash Maps/Sets): data structures that store values, each associated with a key used to look it up.
  • Trees: data structures in which data is organized hierarchically and linked together. Examples include the Binary Tree, Binary Search Tree, Splay Tree, and AVL Tree.
  • Heaps: a specialized tree-based data structure, commonly implemented as a binary heap, in which the tree is a complete binary tree that satisfies the heap property.
  • Graphs: consist of a set of nodes (vertices) and edges connecting pairs of them.
  • Tries (Prefix Trees): a trie is a specialized tree-based data structure used to store a dynamic set of strings, where the keys are usually strings. It is often used for tasks such as autocomplete or dictionary lookups, and enables efficient retrieval of strings by prefix.
  • Union-Find (Disjoint Set Union, DSU): keeps track of a set of elements partitioned into disjoint subsets. It supports two operations: find (determines which subset a particular element is in) and union (merges two subsets). It is often used for network connectivity and in Kruskal's algorithm for finding minimum spanning trees. A minimal sketch follows this list.
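
A minimal Union-Find sketch in Java with path compression and union by size (the class and method names are illustrative, not necessarily those used in this repository):

    // Minimal Union-Find (DSU) sketch with path compression and union by size.
    public class UnionFind {
        private final int[] parent;
        private final int[] size;

        public UnionFind(int n) {
            parent = new int[n];
            size = new int[n];
            for (int i = 0; i < n; i++) {
                parent[i] = i; // each element starts in its own subset
                size[i] = 1;
            }
        }

        // find: determines which subset (root) an element belongs to.
        public int find(int x) {
            if (parent[x] != x) {
                parent[x] = find(parent[x]); // path compression
            }
            return parent[x];
        }

        // union: merges the subsets containing a and b.
        public void union(int a, int b) {
            int ra = find(a), rb = find(b);
            if (ra == rb) return;                              // already in the same subset
            if (size[ra] < size[rb]) { int t = ra; ra = rb; rb = t; }
            parent[rb] = ra;                                   // attach smaller tree under larger
            size[ra] += size[rb];
        }

        public static void main(String[] args) {
            UnionFind uf = new UnionFind(5);
            uf.union(0, 1);
            uf.union(3, 4);
            System.out.println(uf.find(0) == uf.find(1)); // true
            System.out.println(uf.find(1) == uf.find(3)); // false
        }
    }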

Algorithm

  • An algorithm is a well-defined set of instructions used to solve a problem or perform a task.

Types of Algorithms

  • Recursion: a recursive algorithm breaks a problem into smaller sub-problems and calls the same function on them repeatedly until a base case is reached.
  • Dynamic Programming Algorithm: reuses already-computed solutions to avoid repeatedly solving the same part of the problem. It divides the problem into smaller overlapping subproblems, solves each one once, and combines the results (see the sketch after this list).
  • Greedy Algorithms: the solution is built piece by piece, choosing at each step the option that offers the greatest immediate benefit.
  • Backtracking: builds the solution by searching among all possible candidates according to some criteria. Whenever a partial solution fails, we trace back to the failure point, build the next candidate, and continue until a solution is found or all possibilities have been examined.
  • Brute Force Algorithm: the simplest approach to a problem, trying all possibilities; it is usually the first approach that comes to mind.
  • Searching Algorithms: used to find an element, or a group of elements, in a particular data structure. They vary based on their approach and on the data structure in which the element is to be found.
  • Hashing Algorithms: work similarly to searching algorithms, but use an index derived from a key. In hashing, a key is assigned to specific data so it can be located directly.
  • Sorting Algorithms: arrange a group of data in a particular order according to the requirement, generally increasing or decreasing.
  • Divide and Conquer Algorithm: breaks a problem into sub-problems, solves each sub-problem, and merges the solutions to obtain the final answer. It consists of three steps: Divide, Conquer (solve), and Combine.
  • Randomized Algorithms: use random numbers to decide some of their steps, which often simplifies the algorithm or improves its expected running time.
  • Pattern Searching: these algorithms are used to find a pattern (substring) within a larger text (string), locating the position of the pattern in the given string. Common examples include the Knuth-Morris-Pratt (KMP) and Boyer-Moore algorithms.
  • Geometric Algorithms: solve geometric problems such as finding the convex hull, intersections of lines, and nearest neighbors. They deal with points, lines, and polygons in 2D or 3D space.
  • Mathematical Algorithms: solve mathematical problems such as computing prime numbers, performing matrix operations, or solving linear equations. Examples include algorithms from number theory, combinatorics, and algebra, such as the Euclidean algorithm for finding the greatest common divisor (GCD).
  • Bit Algorithms: bit manipulation algorithms perform operations at the bit level, such as flipping, setting, or shifting bits. They are efficient in optimizing space and speed for tasks like counting bits or performing XOR operations. Common building blocks are the bitwise AND, OR, and XOR operations.
  • Graph Algorithms: deal with problems on graphs, which are structures made up of vertices (nodes) connected by edges. Graph algorithms include finding shortest paths (Dijkstra's, Bellman-Ford), detecting cycles, and finding minimum spanning trees (Kruskal's, Prim's).
  • Branch and Bound: an optimization technique for combinatorial problems such as the Traveling Salesman Problem or the Knapsack Problem. The algorithm explores possible solutions but "bounds" (prunes) parts of the search space that cannot lead to an optimal solution, avoiding unnecessary exploration.
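
A small sketch contrasting plain recursion with dynamic programming (memoization), using Fibonacci numbers as the example problem (illustrative code, not taken from this repository):

    import java.util.HashMap;
    import java.util.Map;

    public class Fibonacci {
        // Plain recursion: recomputes the same subproblems over and over, roughly O(2^n) time.
        static long fibRecursive(int n) {
            if (n <= 1) return n;
            return fibRecursive(n - 1) + fibRecursive(n - 2);
        }

        // Dynamic programming (memoization): each subproblem is solved once and cached, O(n) time.
        static final Map<Integer, Long> memo = new HashMap<>();
        static long fibMemo(int n) {
            if (n <= 1) return n;
            Long cached = memo.get(n);
            if (cached != null) return cached;
            long result = fibMemo(n - 1) + fibMemo(n - 2);
            memo.put(n, result);
            return result;
        }

        public static void main(String[] args) {
            System.out.println(fibRecursive(30)); // 832040, noticeably slow for larger n
            System.out.println(fibMemo(90));      // fast even for large n
        }
    }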

Analysis Of Algorithms

Analysis of algorithms deals with finding the best algorithm, i.e. one that runs fast and uses less memory. Two measures are considered:

  • Time Complexity
  • Space Complexity

Time Complexity

  • The amount of time an algorithm takes to run.
  • The size of the input processed by an algorithm determines its time complexity.

Space Complexity

  • The amount of memory (space) an algorithm takes to run.
  • The memory required to process the input determines the space complexity.

Asymptotic Analysis of an Algorithm

  • Asymptotic analysis evaluates the performance of an algorithm in terms of its input size and its behaviour as the input size grows toward infinity.
  • With asymptotic analysis we do not measure the actual running time of an algorithm.
  • It tells us how the time and space taken by an algorithm grow with the input size.

Asymptotic Notations

  • Asymptotic notations are the mathematical tools used to describe the running time of an algorithm in terms of input size.
  • Analogy - the mileage of a car on 1 litre of petrol:
Highway (minimum traffic) - 25 km/litre
City (maximum traffic) - 15 km/litre
City + Highway (average traffic) - 20 km/litre

Asymptotic notations help us determine the:

  • Best Case
  • Average Case
  • Worst Case

Types of Asymptotic Notations

There are three notations for analyzing the running time of an algorithm:

  • Omega (Ω) Notation
  • Big O (O) Notation
  • Theta (Θ) Notation

Omega (Ω) Notation

  • The formal way to express the lower bound of an algorithm's running time.
  • Lower bound means that, for any given input, this notation gives the minimum (best-case) amount of time the algorithm can take to complete.

Big O (O) Notation

  • The formal way to express the upper bound of an algorithm's running time.
  • Upper bound means that, for any given input, this notation gives the longest amount of time the algorithm can take to complete.
  • Used for worst-case analysis.

Rules of Big O (O) Notation

  • Assume a single processor.
  • Assume sequential execution of statements.
  • An assignment takes 1 unit of time, e.g. int x = 5 -> 1 unit of time.
  • A return statement takes 1 unit of time, e.g. return x -> 1 unit of time.
  • An arithmetic operation takes 1 unit of time, e.g. x + y -> 1 unit of time.
  • A logical operation takes 1 unit of time, e.g. x && y -> 1 unit of time.
  • Other small/single operations take 1 unit of time.
  • Drop lower-order terms, e.g. T(n) = n² + 3n + 1 => O(n²).
  • Drop constant multipliers, e.g. T(n) = 3n² + 6n + 1 => O(n²) (see the worked example after this list).
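
A small worked example of applying these rules to a simple loop (illustrative code, not taken from this repository; the unit counts are approximate):

    public class BigOExample {
        // Counting units of time for summing an array of n elements.
        static int sum(int[] arr) {
            int total = 0;                          // 1 unit (assignment)
            for (int i = 0; i < arr.length; i++) {  // loop body runs n times
                total = total + arr[i];             // ~2 units per iteration (addition + assignment)
            }
            return total;                           // 1 unit (return)
        }
        // T(n) ≈ 2n + 2. Dropping constant multipliers and lower-order terms gives O(n).
    }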

Theta (Θ) Notation

  • The formal way to express both the upper and lower bounds of an algorithm's running time, i.e. a tight bound.
  • For any given input, this notation characterizes the amount of time the algorithm typically takes to complete.
  • Commonly associated with average-case analysis (see the linear-search example below).
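
As an illustration of best-, average-, and worst-case analysis, consider a simple linear search (illustrative code, not taken from this repository):

    public class LinearSearch {
        // Returns the index of target in arr, or -1 if it is not present.
        static int linearSearch(int[] arr, int target) {
            for (int i = 0; i < arr.length; i++) {
                if (arr[i] == target) return i;
            }
            return -1;
        }
        // Best case    (target at index 0):                 Ω(1)
        // Worst case   (target absent or at the last index): O(n)
        // Average case (target equally likely anywhere):     Θ(n)
    }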

Contributing

Contributions are welcome! Please follow these steps:

  1. Fork the repository.
  2. Create a new branch (git checkout -b name-dsa-module).
  3. Make your changes and commit them (git commit -m "DSA: Solutions").
  4. Push to the branch (git push origin name-dsa-module).
  5. Create a new Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.



⭐ hit the star button if you found this useful ⭐

Source | Twitter | LinkedIn | Email
