Hello transformers -- Text classification -- Transformer anatomy -- Multilingual named entity recognition -- Text generation -- Summarization -- Question answering -- Making transformers efficient in production -- Dealing with few to no labels -- Training transformers from scratch -- Future directions.
Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library. Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf, who are among the creators of Hugging Face Transformers, use a hands-on approach to teach you how transformers work and how to integrate them into your applications. You'll quickly learn a variety of tasks they can help you solve.
Online bookstore link: https://www.amazon.com/s?k=9781098103248&i=stripbooks-intl-ship&crid=30WBRN1TE95D4&sprefix=9781098103248%2Cstripbooks-intl-ship%2C474&ref=nb_sb_noss
ISBN: 1098103246; 9781098103248; 9781098136796; 1098136799
LCCN: 2023275986
British National Bibliography: GBC202041; GBC2B0429
UK system control numbers: 020443148; 020659171
Subject headings: Natural language processing (Computer science); Electronic transformers; Machine learning; Python (Computer program language)