Email the Authors

You can use this page to email Afshine Amidi, Shervine Amidi, and Yoshiyuki Nakai about Super Study Guide: Transformer と大規模言語モデル.

Please include an email address so the authors can respond to your query.

This message will be sent to Afshine Amidi, Shervine Amidi, and Yoshiyuki Nakai.

This site is protected by reCAPTCHA and the Google Privacy Policy and Terms of Service apply.

About the Book

This book is a concise, illustrated study guide for anyone who wants to understand how large language models (LLMs) work, whether for interview preparation, for use in projects, or out of pure curiosity.

It consists of the following five chapters:

  • Foundations: an introduction to neural networks and the deep learning concepts needed for training and evaluation
  • Embeddings: tokenization algorithms, word embeddings (word2vec), and sentence embeddings (RNNs, LSTMs, GRUs)
  • Transformers: the motivation behind the self-attention mechanism, the details of the encoder-decoder architecture along with models such as BERT, GPT, and T5, and techniques for speeding up computation
  • Large language models: the main methods for adapting Transformer-based models, including prompt engineering, (parameter-efficient) fine-tuning, and preference tuning
  • Applications: common tasks including sentiment extraction, machine translation, and retrieval-augmented generation

On this page, you can apply a 15% discount yourself, offered for the following reasons. Thank you for your interest and support!

  1. As a benefit for customers who have already purchased the print edition
  2. As an accommodation for differences in purchasing power between regions

About the Authors

Afshine Amidi

@afshinea

Afshine Amidi is currently teaching the Transformers & Large Language Models workshop at Stanford and is also leading LLM efforts at Netflix. Previously, he worked on the Gemini team at Google and used NLP techniques to solve complex queries. Before that, he worked at Uber Eats on improving the quality of its search and recommendation systems. On the side, Afshine published a few papers at the intersection of deep learning and computational biology. He holds a Bachelor’s and a Master’s Degree from École Centrale Paris and a Master’s Degree from MIT.

Shervine Amidi

@shervinea

Shervine Amidi is currently teaching the Transformers & Large Language Models workshop at Stanford and is also working on the Gemini team at Google to leverage LLMs for action-based queries. Previously, he worked on applied machine learning problems for recommender systems at Uber Eats where he focused on representation learning to better surface dish recommendations. On the side, Shervine published a few papers at the intersection of deep learning and computational biology. He holds a Bachelor’s and a Master’s Degree from École Centrale Paris and a Master’s Degree from Stanford University.

Yoshiyuki Nakai

@yoshiyukinakai
