COMP10002 Foundations of Algorithms

Concepts

The core ideas you need for the assignment, plus optional demos, tooling, and background if you want extra help.

Needed for assignment

Attention

A mechanism that lets one position decide how much other positions should influence it.

Q, K, V Projections

How a single input vector is transformed into the separate query, key, and value vectors that attention compares and combines.

KV Cache

A store of previously computed keys and values so generation can reuse past work.

Practical resources

Annotated Scaffold

A guided reading of the provided `a1.c`, with direct links from the main spec and tooltip-style explanations for the names, arrays, and helper functions you keep seeing.

Sample input file

A small concrete example of the assignment input format, showing what a real input file looks like and how to read each line.

Strings in C

The small amount of C string handling you may need for Stage 1: storing tokens, comparing them, sorting them, and copying them safely.

Toy demo

A concrete numeric walkthrough that connects the assignment data back to token positions and attention weights.

Optional concepts and background

Tokens and Embeddings

How text is split into tokens, then converted into numeric vectors a model can work with.

Transformer explainer

A one-page view of the larger model architecture so you can see where this assignment fits.

Transformer history

Why attention mattered historically, including the 2017 paper and why the impact took time to become obvious.

The main assignment page links each needed piece of material inline at the point where it matters, so you can either read through this page in order or jump in from the spec as needed.