Decoding GPT : an intuitive understanding of large language models / Devesh Rajadhyax
Material type:
- ISBN: 9788119445790
- Call number: 006.3 RAJ-D
| Item type | Current library | Collection | Shelving location | Call number | Copy number | Status | Date due | Barcode | Item holds |
|---|---|---|---|---|---|---|---|---|---|
| | BITS Pilani Hyderabad | 003-007 | General Stack (For lending) | 006.3 RAJ-D | INR 449.00 | Checked out | 10/11/2025 | 49106 | |
Browsing BITS Pilani Hyderabad shelves, Shelving location: General Stack (For lending), Collection: 003-007
- 006.3 PUD-V Data mining /
- 006.3 PUD-V Data mining /
- 006.3 RAH-W AI and machine learning /
- 006.3 RAJ-D Decoding GPT : an intuitive understanding of large language models /
- 006.3 RIV-J Practical tensorflow.js: deep learning in web app development /
- 006.3 ROI-R Data mining : a tutorial based primer /
- 006.3 SAR-S Artificial intelligence :
In a world where Large Language Models (LLMs) like ChatGPT have ignited imaginations, people from all walks of life are eager to embrace the transformative potential of generative AI. Whether you are a tech professional, a decision-maker, an entrepreneur, or a budding student, understanding this new paradigm is a shared pursuit. It is within this landscape that 'Decoding GPT: An Intuitive Introduction to LLMs' emerges as your essential guide.
As the author of 'Decoding GPT', Devesh Rajadhyax invites you to join him on a journey into the heart of LLMs. The book starts with the fundamentals of machine learning and neural networks and then dives into the inner workings of Large Language Models, all while keeping complex math and programming at bay. Instead, it uses clear diagrams and relatable examples to build a deep understanding. If your aim is to thrive in the world of generative AI, 'Decoding GPT' is your passport to a brighter future in this exciting field.