RNN
- why
- we want the model to remember things from the past so that it can store and use context information
- types of context:
- Short term memory
- the immediately preceding words
- Long term memory
- words further back in the sequence, not just the immediately preceding ones
- how
- apply the same update function at every time step, so the same weights are reused across the whole sequence (see the sketch below)
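A minimal sketch of the "same function over and over" idea, using PyTorch's `nn.RNNCell`; the sizes (`input_size=8`, `hidden_size=16`, `seq_len=5`) are made-up placeholders:

```python
import torch
import torch.nn as nn

input_size, hidden_size, seq_len = 8, 16, 5
cell = nn.RNNCell(input_size, hidden_size)   # one shared update function

x = torch.randn(seq_len, input_size)         # a toy sequence of 5 input vectors
h = torch.zeros(hidden_size)                 # hidden state = the model's "memory"

for t in range(seq_len):
    # The same cell (same weights) is applied at every time step;
    # the hidden state carries context from earlier steps forward.
    h = cell(x[t].unsqueeze(0), h.unsqueeze(0)).squeeze(0)

print(h.shape)  # torch.Size([16])
```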
(OPTIONAL) LSTM
GAN
- Generative Adversarial Networks
- Two neural-network models contest with each other:
- A generative model that learns the training data and generates samples based on that learning
- A discriminative model that guesses whether a given sample is original training data or a fake from the generative model
- Goal:
- For the generative model to produce samples that are increasingly hard for the discriminative model to classify
- For the discriminative model to get better at telling fake samples apart from real ones
- The ultimate goal is for the generative model to produce samples so convincing that it is hard to tell whether they are fake (see the training-loop sketch below)
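A minimal GAN training-loop sketch in PyTorch on toy 2-D data; the architectures, learning rates, and the "real" data (points around (3, 3)) are illustrative assumptions, not from the notes:

```python
import torch
import torch.nn as nn

# Generator: noise vector -> fake 2-D sample; Discriminator: sample -> P(real)
G = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 2))
D = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(200):
    real = torch.randn(32, 2) + 3.0          # toy "training set": points around (3, 3)
    fake = G(torch.randn(32, 4))             # generator's current attempts

    # Discriminator step: label real data 1, generated data 0.
    d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator step: try to make the discriminator output 1 ("real") on fakes.
    g_loss = bce(D(fake), torch.ones(32, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
```

Each side's loss pushes against the other: the discriminator improves at spotting fakes, which in turn forces the generator to produce more convincing samples.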
Encoder & Decoder, Transformer, Attention
- Encoder & Decoder:
- Def:
- E.g. machine translation: an English phrase → 😅 → its Chinese equivalent
- The first step is encoding, the second step is decoding
- 😅 here is the “hidden state” of the model
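A sketch of that two-step pipeline with a GRU encoder and decoder in PyTorch; the vocabulary sizes and dimensions are arbitrary placeholders:

```python
import torch
import torch.nn as nn

src_vocab, tgt_vocab, emb_dim, hid_dim = 100, 100, 32, 64

enc_emb = nn.Embedding(src_vocab, emb_dim)
encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
dec_emb = nn.Embedding(tgt_vocab, emb_dim)
decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
out_proj = nn.Linear(hid_dim, tgt_vocab)

src = torch.randint(0, src_vocab, (1, 6))      # one source sentence as 6 token ids
_, hidden = encoder(enc_emb(src))              # hidden: the fixed-size summary (the 😅 above)

tgt_in = torch.randint(0, tgt_vocab, (1, 5))   # decoder input tokens (teacher forcing)
dec_out, _ = decoder(dec_emb(tgt_in), hidden)  # decoding starts from the encoder's hidden state
logits = out_proj(dec_out)                     # (1, 5, tgt_vocab): one distribution per output token
```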
- Embedding
- It is possible to represent each word with an id, which makes each word unique
- But that loses the linguistic context
- So we now use embeddings to capture that context as well
- How can we get those embeddings?
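A lookup sketch with PyTorch's `nn.Embedding`, one common way to get them (they can also be pre-trained, e.g. word2vec or GloVe); the tiny vocabulary and embedding size here are made up:

```python
import torch
import torch.nn as nn

vocab = {"the": 0, "cat": 1, "sat": 2}     # a bare id makes each word unique, but carries no meaning
emb = nn.Embedding(num_embeddings=len(vocab), embedding_dim=4)

ids = torch.tensor([vocab["cat"], vocab["sat"]])
vectors = emb(ids)                         # (2, 4) dense vectors; training nudges words used in
print(vectors.shape)                       # similar contexts toward similar vectors
```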
- Transformer, attention, BERT:
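A minimal scaled dot-product attention sketch (the core operation inside the Transformer); the shapes are illustrative only:

```python
import math
import torch

q = torch.randn(1, 5, 16)   # queries: (batch, query_len, d_k)
k = torch.randn(1, 7, 16)   # keys:    (batch, key_len, d_k)
v = torch.randn(1, 7, 16)   # values:  (batch, key_len, d_v)

scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))  # (1, 5, 7): each query scored against each key
weights = scores.softmax(dim=-1)                           # weights over the keys sum to 1
output = weights @ v                                       # (1, 5, 16): weighted mix of the values
```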