000 02817nam a22004217i 4500
001 1456030850
003 JO-AjAnu
005 20241110100328.0
006 m o d
007 cr cnu|||unuuu
008 240916s2024 enka ob 001 0 eng d
020 _z9781805128724
035 _a(OCoLC)1456030850
037 _a9781805128724
_bO'Reilly Media
040 _aORMDA
_beng
_erda
_epn
_cORMDA
_dOCLCO
_dJO-AjAnu
050 4 _aQ336
_b2024
082 0 4 _a006.3
_223/eng/20240916
100 1 _aRothman, Denis,
_eauthor
245 1 0 _aTransformers for natural language processing and computer vision :
_bexplore generative AI and large language models with Hugging Face, ChatGPT, GPT-4V, and DALL-E3 /
_cDenis Rothman
250 _aThird edition
264 1 _aBirmingham, UK :
_bPackt Publishing Ltd.,
_c2024
300 _a1 online resource (730 pages) :
_billustrations
336 _atext
_btxt
_2rdacontent
337 _acomputer
_bc
_2rdamedia
338 _aonline resource
_bcr
_2rdacarrier
490 1 _aExpert insight
504 _aIncludes bibliographical references and index
506 _aAvailable to OhioLINK libraries
520 _aTransformers for Natural Language Processing and Computer Vision, Third Edition, explores Large Language Model (LLM) architectures, applications, and the platforms (Hugging Face, OpenAI, and Google Vertex AI) used for Natural Language Processing (NLP) and Computer Vision (CV). The book guides you from earlier transformer architectures to the latest foundation models and generative AI. You’ll pretrain and fine-tune LLMs and work through use cases ranging from summarization to question-answering systems built on embedding-based search techniques. You’ll also learn about the risks of LLMs, from hallucination and memorization to privacy, and how to mitigate those risks using moderation models with rule and knowledge bases. You’ll implement Retrieval Augmented Generation (RAG) with LLMs to improve the accuracy of your models and gain greater control over LLM outputs. You’ll dive into generative vision transformers and multimodal model architectures and build applications such as image- and video-to-text classifiers, then go further by combining different models and platforms and learning about AI agent replication. The book provides you with an understanding of transformer architectures, pretraining, fine-tuning, LLM use cases, and best practices
630 0 0 _aChatGPT
650 0 _aArtificial intelligence
_xData processing
650 0 _aNatural language processing (Computer science)
650 0 _aCloud computing
710 2 _aOhio Library and Information Network
830 0 _aExpert insight
856 4 0 _3O'Reilly
_zConnect to resource
_uhttps://learning.oreilly.com/library/view/~/9781805128724/?ar
942 _2lcc
_cBK
_n0
999 _c33847
_d33847