Transformers for natural language processing : build, train, and fine-tune deep neural network architectures for NLP with Python, PyTorch, TensorFlow, BERT, and GPT-3 /
BONUS: OpenAI ChatGPT, GPT-4, and DALL-E notebooks in the book's GitHub repository - start coding with these SOTA transformers. OpenAI's GPT-3 and Hugging Face transformers for language tasks in one book. Plus, get a taste of the future of transformers, including computer vision tasks and co...
Main author:
Other authors:
Format: Licensed eBooks
Language: English
Published: Birmingham, UK : Packt Publishing, [2022]
Edition: Second edition.
Online access: https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=3197830
Table of contents:
- What are Transformers?
- Getting Started with the Architecture of the Transformer Model
- Fine-Tuning BERT Models
- Pretraining a RoBERTa Model from Scratch
- Downstream NLP Tasks with Transformers
- Machine Translation with the Transformer
- The Rise of Suprahuman Transformers with GPT-3 Engines
- Applying Transformers to Legal and Financial Documents for AI Text Summarization
- Matching Tokenizers and Datasets
- Semantic Role Labeling with BERT-Based Transformers
- Let Your Data Do the Talking: Story, Questions, and Answers
- Detecting Customer Emotions to Make Predictions
- Analyzing Fake News with Transformers
- Interpreting Black Box Transformer Models
- From NLP to Task-Agnostic Transformer Models
- The Emergence of Transformer-Driven Copilots
- The Consolidation of Suprahuman Transformers with OpenAI's ChatGPT and GPT-4
- Appendix I - Terminology of Transformer Models
- Appendix II - Hardware Constraints for Transformer Models
- Appendix III - Generic Text Completion with GPT-2
- Appendix IV - Custom Text Completion with GPT-2
- Appendix V - Answers to the Questions.