Hands-On Large Language Models (Audiobook), by Jay Alammar and Maarten Grootendorst

Hands-On Large Language Models

Language Understanding and Generation


By: Jay Alammar, Maarten Grootendorst
Narrated by: Derek Shoales

AI has acquired startling new language capabilities in just the past few years. Driven by rapid advances in deep learning, language AI systems are able to write and understand text better than ever before. This trend is enabling new features, products, and entire industries. With this book, listeners will learn practical tools and concepts they need to use these capabilities today.

You'll understand how to use pretrained large language models for use cases like copywriting and summarization; create semantic search systems that go beyond keyword matching; and use existing libraries and pretrained models for text classification, search, and clustering.

This book also helps you:

- Understand the architecture of Transformer language models that excel at text generation and representation
- Build advanced LLM pipelines to cluster text documents and explore the topics they cover
- Build semantic search engines that go beyond keyword search, using methods like dense retrieval and rerankers
- Explore how generative models can be used, from prompt engineering all the way to retrieval-augmented generation
- Gain a deeper understanding of how to train LLMs and optimize them for specific applications using generative model fine-tuning, contrastive fine-tuning, and in-context learning
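The semantic-search topic the description mentions centers on dense retrieval: embedding queries and documents as vectors and ranking them by similarity rather than by keyword overlap. A minimal sketch of that idea follows, using hand-made toy vectors in place of a real pretrained embedding model (the document names, vector values, and `semantic_search` helper are illustrative assumptions, not the book's code):

```python
import math

# Toy "embeddings": hand-made 3-dimensional vectors standing in for the
# output of a real pretrained embedding model. The documents and values
# here are made up for illustration.
docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "return an item": [0.85, 0.15, 0.05],
    "shipping times": [0.1, 0.8, 0.2],
}

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of the norms.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def semantic_search(query_vec, docs, top_k=2):
    # Rank documents by similarity to the query vector, not by shared words.
    ranked = sorted(docs, key=lambda name: cosine(query_vec, docs[name]), reverse=True)
    return ranked[:top_k]

# A query vector that lands near the "refund"/"return" region of the space
# retrieves those documents even if the query shares no keywords with them.
query = [0.88, 0.12, 0.02]
print(semantic_search(query, docs))  # ['refund policy', 'return an item']
```

In a real pipeline the vectors come from an embedding model and the similarity search runs over a vector index; a reranker can then reorder the top results, which is the two-stage setup the description alludes to.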

©2024 Jay Alammar and Maarten Pieter Grootendorst (P)2024 Ascent Audio
Computing, Data Science, Machine Learning, Programming
The author's style makes this hard to listen to, and the experience is dry.

This book has a lot of interesting details. Unfortunately, the author spells out every input, token, and training example as literal text: “open parenthetical”, “word one”, “word two”, “comma”, “comma”, “closed parenthetical”. As a result, this is one of those texts that is likely better as a physical book.

open parenthetical, close parenthetical, comma, comma, space


I don't need to hear each quotation mark and space when code fragments are being explained.

This takes up way too much of the book.

The reading of code is done badly
