Calculus – Transcendental Functions

References: Transcendental Functions. In mathematics, a transcendental function is a function that does not satisfy a polynomial equation with polynomial coefficients; put simply, it is any function that is not algebraic. While algebraic functions can be constructed using a finite number of elementary operations (addition, subtraction, multiplication, division, and taking roots), transcendental functions ... Read More
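A small worked contrast between the two classes (a standard example, not drawn from the post itself):

```latex
% Algebraic: y = \sqrt{x} satisfies a polynomial equation in x and y.
y = \sqrt{x} \;\Longrightarrow\; y^{2} - x = 0
% Transcendental: no nonzero polynomial P(x, y) gives P(x, e^{x}) \equiv 0,
% so y = e^{x} is not algebraic.
y = e^{x}, \qquad P\bigl(x, e^{x}\bigr) \not\equiv 0 \ \text{for every nonzero polynomial } P(x, y)
```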

Transformer Part I – Theory, Original Paper etc.

Resources: Transformer. The Transformer is an encoder-decoder model built around a mechanism called Attention. Analogy: imagine you are at a loud party and hear fragments of conversations; to understand a specific sentence, your brain does three things instantly. Transformer Architecture: the Transformer architecture is an Encoder-Decoder structure. ... Read More
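The "three things" in the analogy correspond to queries, keys, and values in attention. Below is a minimal NumPy sketch of scaled dot-product attention, assuming that is the mechanism the post walks through; the function and the toy data are illustrative, not code from the post:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for one attention head."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # softmax over the keys
    return weights @ V                                   # weighted sum of the values

# Toy example: 3 tokens, model dimension 4
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)       # (3, 4)
```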

LM, N-Grams, RNNs, LLM, Fine Tuning, LoRA, QLoRA – Part III

LLMs: Fine-tuning (https://developers.google.com/machine-learning/crash-course/llm/tuning). Foundation Models: a Foundation LLM (or base/pre-trained model) is a general-purpose model trained on vast amounts of data. It understands grammar and can perform creative tasks like writing poetry. However, to solve specific problems (like classification or regression), it often serves as a starting platform rather than a finished solution. Fine Tuning: fine-tuning ... Read More
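A minimal PyTorch sketch of the "frozen foundation model plus small trainable head" flavor of fine-tuning; the stand-in base model, layer sizes, and toy batch are purely illustrative assumptions, not the post's code:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a pre-trained base model (names and sizes are illustrative).
base = nn.Sequential(nn.Embedding(1000, 64), nn.Flatten(), nn.Linear(64 * 8, 128))
for p in base.parameters():
    p.requires_grad = False              # freeze the foundation model's weights

head = nn.Linear(128, 2)                 # small task-specific classification head

opt = torch.optim.AdamW(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

tokens = torch.randint(0, 1000, (4, 8))  # toy batch: 4 sequences of 8 token ids
labels = torch.tensor([0, 1, 0, 1])

logits = head(base(tokens))              # only the head receives gradient updates
loss = loss_fn(logits, labels)
loss.backward()
opt.step()
```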

LM, N-Grams, RNNs, LLM, Fine Tuning, LoRA, QLoRA – Part II

Introduction to Large Language Models (https://developers.google.com/machine-learning/crash-course/llm). LLMs: What is a Large Language Model? An LLM is a predictive technology that estimates the next "token" (word, character, or subword) in a sequence. LLMs outperform older models (like N-grams) because they use vastly more parameters and can process significantly more context at once. Transformer: it is the most successful ... Read More
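A tiny numeric sketch of the "estimate the next token" idea, using a made-up four-word vocabulary and made-up scores (none of this comes from the post itself):

```python
import numpy as np

# Toy next-token prediction: score every vocabulary token, then softmax.
vocab = ["the", "cat", "sat", "mat"]
logits = np.array([0.2, 1.5, 0.3, 2.0])      # illustrative scores for the next token

probs = np.exp(logits - logits.max())
probs /= probs.sum()                         # softmax turns scores into probabilities

for token, p in zip(vocab, probs):
    print(f"{token}: {p:.2f}")
print("predicted next token:", vocab[int(np.argmax(probs))])   # "mat"
```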

LM, N-Grams, RNNs, LLM, Fine Tuning, LoRA, QLoRA – Part I

Introduction to Large Language Models (https://developers.google.com/machine-learning/crash-course/llm). What is a Language Model? At its simplest, a language model is a statistical tool that predicts the next piece of text in a sequence. The N-gram Approach: early language models used "N-grams," which are simply ordered sequences of words, where N is the number of words. Context: context refers to ... Read More
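A minimal sketch of the N-gram idea for N = 2 (a bigram model); the toy corpus and helper name are illustrative assumptions, not taken from the post:

```python
from collections import Counter, defaultdict

# Toy bigram (N = 2) model: count word pairs, then predict the most likely follower.
corpus = "the cat sat on the mat the cat slept".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the word most frequently observed after `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))   # 'cat' (seen twice after 'the')
print(predict_next("cat"))   # 'sat' ('sat' and 'slept' tie; ties break by first occurrence)
```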

Quantum Error Correction Challenges – Quantum Computing

Book References: Explorations in Quantum Computing by Colin P. Williams; Quantum Computation and Quantum Information by Nielsen et al. Quantum error correction (QEC) is widely considered the "holy grail" of quantum computing and also its most formidable bottleneck. While classical computers can easily correct errors by copying data (redundancy), quantum computers are forbidden from doing this ... Read More
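A small worked illustration of why redundancy works differently here (standard textbook material, not an excerpt from the cited books): the three-qubit bit-flip code encodes a logical qubit by entangling physical qubits rather than by copying the unknown state, which the no-cloning theorem forbids.

```latex
% Three-qubit bit-flip (repetition) code: encode by entangling, not by copying.
\alpha\lvert 0\rangle + \beta\lvert 1\rangle
  \;\longmapsto\;
\alpha\lvert 000\rangle + \beta\lvert 111\rangle
\quad\neq\quad
\bigl(\alpha\lvert 0\rangle + \beta\lvert 1\rangle\bigr)^{\otimes 3}
```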

Data Structure and Algorithms – An Overview

T. H. Cormen, Ed., Introduction to Algorithms, 2nd ed., 10th pr., Cambridge, Mass.: MIT Press, 2007; Data Structures and Algorithms, Sixth Edition, Goodrich et al., Wiley. What is a data structure? https://www.ibm.com/think/topics/data-structure What are algorithms? An algorithm is simply a specific, step-by-step set of instructions used to complete a task or solve a problem. ... Read More
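To ground the "step-by-step set of instructions" definition, here is a minimal sketch of a classic algorithm over a sorted array (binary search); a standard example chosen for illustration rather than one taken from the post:

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent (O(log n))."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1          # discard the lower half
        else:
            hi = mid - 1          # discard the upper half
    return -1

print(binary_search([2, 3, 5, 7, 11, 13], 11))  # 4
print(binary_search([2, 3, 5, 7, 11, 13], 6))   # -1
```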

Quantum Error Correction – Quantum Computing

Book References: In an ideal setup, we assume that logical qubits evolve unitarily, following Schrödinger's equation, from the moment the quantum computer is prepared, through the computation, to the final measurement. In practice, however, a real quantum system couples to its environment, causing information to leak out of the logical state of the qubits in the quantum memory ... Read More
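Assuming the post builds up to the simplest example, the three-qubit bit-flip code, a short worked illustration of how such an error is caught: parity (syndrome) measurements locate the flipped qubit without measuring, and hence without disturbing, the logical amplitudes α and β. The operators below are the standard stabilizer checks, not notation taken from the post.

```latex
% A bit flip on qubit 1 of the encoded state:
X_{1}\bigl(\alpha\lvert 000\rangle + \beta\lvert 111\rangle\bigr)
  = \alpha\lvert 100\rangle + \beta\lvert 011\rangle
% Syndrome measurement: parity checks locate the error without collapsing the logical state.
\langle Z_{1}Z_{2}\rangle = -1, \qquad \langle Z_{2}Z_{3}\rangle = +1
  \;\Longrightarrow\; \text{apply } X_{1} \text{ to recover.}
```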