Sequence Transformer Project

About This Project

This website showcases the Sequence Transformer project, a machine learning model designed to learn and predict numerical sequence patterns such as linear, quadratic, logarithmic, Fibonacci, Collatz, and more.

Abstract

This project investigates the capability of a hybrid transformer architecture to learn and predict numerical sequence patterns. We generated a labeled synthetic dataset covering ten distinct sequence types: constant, linear, quadratic, logarithmic, prime numbers, Collatz sequences, arithmetic progressions, Fibonacci sequences, noisy sinusoidal patterns, and geometric progressions. Our model extends a standard transformer encoder with multiple prediction heads (delta, ratio, and direct value), type embeddings, and a learned gating mechanism that combines the prediction strategies. Training proceeds by normalizing the synthetic dataset to zero mean and unit standard deviation; a sliding window over each sequence then yields input–target pairs, and the network's weights are adjusted to minimize the prediction error. Experimental results show that the transformer architecture accurately captures patterns in deterministic sequences and achieves partial success on chaotic or irregular sequences (Collatz, Fibonacci, prime numbers), revealing limitations stemming from their irregular structure. This study demonstrates that transformers can serve as effective sequence learners, though with clear limitations.
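The preprocessing and prediction pipeline described above can be sketched as follows. This is a minimal illustration, not the project's actual code: the window size, function names, and the fixed gate logits are assumptions chosen for clarity (in the real model the gate weights are learned).

```python
import numpy as np

def normalize(seq):
    """Normalize a sequence to zero mean and unit standard deviation."""
    mean, std = seq.mean(), seq.std()
    return (seq - mean) / std, mean, std

def sliding_windows(seq, window=8):
    """Pair each window of `window` consecutive values with the next value
    as the prediction target (the sliding-window technique)."""
    X = np.stack([seq[i:i + window] for i in range(len(seq) - window)])
    y = seq[window:]
    return X, y

def gated_combine(pred_delta, pred_ratio, pred_value, gate_logits):
    """Mix the three head outputs (delta, ratio, direct value) with a
    softmax gate; here the gate logits are fixed for illustration."""
    w = np.exp(gate_logits) / np.exp(gate_logits).sum()
    return w[0] * pred_delta + w[1] * pred_ratio + w[2] * pred_value
```

For example, a linear sequence of length 20 with `window=4` yields 16 training pairs, and uniform gate logits average the three head predictions equally.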

Paper

📄 Open full paper as PDF

View Project Repository

Project Video
