| Tag | Contrib |
| --- | --- |
| Textbooks | AI-ML Youtube Channels |
| From The Illustrated Transformer | Bert Pytorch |
| Huggingface | Andrej Karpathy |
| Datasets | Free Dolly: Introducing the World's First Truly Open Instruction-Tuned LLM |
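
The Illustrated Transformer walkthrough centers on scaled dot-product attention, so a minimal NumPy sketch of that formula may help; the function name and toy shapes below are illustrative, not taken from the post.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Row-wise softmax (shift by max for numerical stability).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # weighted sum of value vectors

# Toy example: 3 tokens, d_k = 4.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 4)
```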
Follow-up works
- Depthwise Separable Convolutions for Neural Machine Translation
- One Model To Learn Them All
- Discrete Autoencoders for Sequence Models
- Generating Wikipedia by Summarizing Long Sequences
- Image Transformer
- Training Tips for the Transformer Model
- Self-Attention with Relative Position Representations
- Fast Decoding in Sequence Models using Discrete Latent Variables
- Adafactor: Adaptive Learning Rates with Sublinear Memory Cost
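
Of these, Adafactor is the easiest to try directly; a minimal sketch assuming the implementation shipped in Hugging Face transformers (the Linear model is just a stand-in for any torch module):

```python
import torch
from transformers.optimization import Adafactor

model = torch.nn.Linear(16, 4)  # stand-in for any model

# Paper-recommended setup: relative step sizes with parameter scaling,
# so no external learning rate is supplied (lr=None).
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)

loss = model(torch.randn(8, 16)).pow(2).mean()
loss.backward()
optimizer.step()
```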
Fundamental Tutorials
- Course :: David Silver :: UCL Course on RL https://www.davidsilver.uk/teaching/
Self-instruct:
- https://github.com/yizhongw/self-instruct
- Self-Instruct: Aligning Language Models with Self-Generated Instructions https://arxiv.org/abs/2212.10560
- https://github.com/tatsu-lab/stanford_alpaca
- LLaMA: Open and Efficient Foundation Language Models (weights released under a non-commercial research license, so not actually open)
- Alpaca, fine-tuned from LLaMA, inherits that restriction and so is not really open either: https://crfm.stanford.edu/2023/03/13/alpaca.html
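
A rough sketch of the bootstrapping loop the Self-Instruct paper describes; `generate()` and `rouge_l()` are hypothetical stand-ins for an LLM call and a ROUGE-L scorer, and the real pipeline also generates input-output instances for each new instruction:

```python
import random

def self_instruct(seed_tasks, generate, rouge_l, rounds=10, threshold=0.7):
    """Grow a task pool by prompting a model with examples from the pool."""
    pool = list(seed_tasks)
    for _ in range(rounds):
        # 1. Prompt the model with a few in-context examples from the pool.
        examples = random.sample(pool, k=min(8, len(pool)))
        prompt = "Come up with a new task:\n" + "\n".join(examples)
        candidate = generate(prompt)
        # 2. Keep the candidate only if it is sufficiently novel; the paper
        #    filters near-duplicates with ROUGE-L against the existing pool.
        if all(rouge_l(candidate, task) < threshold for task in pool):
            pool.append(candidate)
    return pool
```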
EleutherAI
- https://www.eleuther.ai/releases
- https://en.wikipedia.org/wiki/EleutherAI
- Pythia, a suite of models designed to enable controlled scientific research on transparently trained LLMs: https://github.com/EleutherAI/pythia
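
Pythia checkpoints, including intermediate training steps, load through the standard transformers API; a short example following the revision-branch convention documented in the Pythia repo:

```python
from transformers import AutoTokenizer, GPTNeoXForCausalLM

# Load an intermediate checkpoint (training step 3000) of the smallest model.
model = GPTNeoXForCausalLM.from_pretrained("EleutherAI/pythia-70m", revision="step3000")
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-70m", revision="step3000")

inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(tokens[0]))
```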
Code