Model Vectors


Bibliographic Details
Published in: arXiv.org
Main Author: Prager, John
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 28.11.2022

Summary: In this article, we discuss a novel approach to solving number sequence problems, in which sequences of numbers following unstated rules are given and missing terms are to be inferred. We develop a methodology of decomposing test sequences into linear combinations of known base sequences and using the decomposition weights to predict the missing term. We show that if assumptions are made ahead of time about the expected base sequences, then a Model Vector can be created whose dot product with the input produces the result. This is surprising, since it means sequence problems can be solved with no knowledge of the hidden rule. Model vectors can be created either by matrix inversion or by a novel combination function applied to primitive vectors. A heuristic algorithm to compute the most likely model vector from the input is described. Finally, we evaluate the algorithm on a suite of number sequence problem tests.
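The core idea in the abstract can be sketched in a few lines of numpy. The following is a minimal illustration, not the paper's implementation: the polynomial base sequences, the function names, and the example input are assumptions chosen for brevity (the paper's actual base sequences and its combination function are described in the full text). It shows how a model vector, built once by matrix inversion from an assumed basis, predicts a missing term via a single dot product with the visible terms.

```python
import numpy as np

# Hypothetical base sequences (not necessarily the paper's basis): a constant,
# a linear, a quadratic, and a cubic sequence evaluated at the given positions.
def base_matrix(positions):
    """Rows = positions, columns = base-sequence values at each position."""
    p = np.asarray(list(positions), dtype=float)
    return np.column_stack([np.ones_like(p), p, p**2, p**3])

def model_vector(n_visible):
    """Model vector m such that m . (first n_visible terms) predicts term
    n_visible + 1, assuming the sequence is a linear combination of the
    base sequences above."""
    B = base_matrix(range(1, n_visible + 1))       # n_visible x n_bases
    b_next = base_matrix([n_visible + 1]).ravel()  # base values at the missing position
    # Decomposition weights w solve B w = x, so the prediction is
    # b_next . w = b_next . (B^-1 x) = ((B^-1)^T b_next) . x.
    # Hence m = (B^-1)^T b_next, obtained here by solving B^T m = b_next.
    return np.linalg.solve(B.T, b_next)

# Example: predict the 5th term of 2, 5, 10, 17, ... (hidden rule n^2 + 1).
x = np.array([2.0, 5.0, 10.0, 17.0])
m = model_vector(len(x))
print(m @ x)  # 26.0, obtained without ever stating the hidden rule
```

In this sketch the dot product m @ x yields 26 for the sequence 2, 5, 10, 17, matching the unstated rule n^2 + 1, which mirrors the abstract's claim that a precomputed model vector answers the problem with no knowledge of the hidden rule.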
ISSN: 2331-8422