gpt2tc: Text Completion and Compression using GPT-2

gpt2tc is a small program that uses the GPT-2 language model to complete and compress (English) texts. It has no external dependencies, requires no GPU, and is quite fast. The smallest model (117M parameters) is provided. Larger models can be downloaded as well.

The compression ratios are much higher than those of conventional compressors, at the expense of speed and of a much larger decompressor. See the documentation for results on text files from well-known compression data sets.
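The reason a language model compresses so well follows from information theory: a symbol the model assigns probability p can be encoded in about -log2(p) bits by an arithmetic coder, so the compressed size approaches the model's cross-entropy on the text. A better predictor therefore means a smaller output. The sketch below illustrates this bound with a toy order-0 character model; it is not the gpt2tc code, which uses GPT-2's contextual next-token probabilities instead.

```python
import math
from collections import Counter

def estimate_compressed_bits(text, probs):
    """Shannon bound: a symbol with probability p costs -log2(p) bits
    when encoded with an ideal arithmetic coder."""
    return sum(-math.log2(probs[ch]) for ch in text)

text = "the theory of the arithmetic coder"

# Toy order-0 model estimated from the text itself (hypothetical stand-in;
# gpt2tc instead queries GPT-2 for a probability conditioned on context).
counts = Counter(text)
total = len(text)
probs = {ch: c / total for ch, c in counts.items()}

bits = estimate_compressed_bits(text, probs)
print(f"{bits:.1f} bits vs {8 * len(text)} bits uncompressed")
```

Even this context-free model beats 8 bits per character; a large neural model's much sharper predictions are what push the ratio well past conventional compressors, at the cost of running the model for every token on both ends.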

The web site textsynth.org relies on gpt2tc.

Documentation

Download


Fabrice Bellard - https://bellard.org/