Local LLMs via Ollama & LM Studio - The Practical Guide
  1. Welcome To The Course! (00:00:19)
  2. What Exactly Are "Open LLMs"? (00:06:28)
  3. Why Would You Want To Run Open LLMs Locally? (00:06:53)
  4. Popular Open LLMs - Some Examples (00:03:44)
  5. Where To Find Open LLMs? (00:04:48)
  6. Running LLMs Locally - Available Options (00:07:18)
  7. Check The Model Licenses! (00:04:05)
  8. Module Introduction (00:01:21)
  9. LLM Hardware Requirements - First Steps (00:04:22)
  10. Deriving Hardware Requirements From Model Parameters (00:05:35)
  11. Quantization To The Rescue! (00:06:51)
  12. Does It Run On Your Machine? (00:05:51)
  13. Module Introduction (00:02:04)
  14. Running Locally vs Remotely (00:01:09)
  15. Installing & Using LM Studio (00:03:10)
  16. Finding, Downloading & Activating Open LLMs (00:09:05)
  17. Using the LM Studio Chat Interface (00:04:54)
  18. Working with System Prompts & Presets (00:03:27)
  19. Managing Chats (00:02:33)
  20. Power User Features For Managing Models & Chats (00:06:29)
  21. Leveraging Multimodal Models & Extracting Content From Images (OCR) (00:02:49)
  22. Analyzing & Summarizing PDF Documents (00:03:28)
  23. Onwards To More Advanced Settings (00:01:53)
  24. Understanding Temperature, top_k & top_p (00:06:33)
  25. Controlling Temperature, top_k & top_p in LM Studio (00:04:46)
  26. Managing the Underlying Runtime & Hardware Configuration (00:04:18)
  27. Managing Context Length (00:05:22)
  28. Using Flash Attention (00:05:09)
  29. Working With Structured Outputs (00:05:30)
  30. Using Local LLMs For Code Generation (00:02:36)
  31. Content Generation & Few Shot Prompting (Prompt Engineering) (00:05:22)
  32. Onwards To Programmatic Use (00:02:26)
  33. LM Studio & Its OpenAI Compatibility (00:06:01)
  34. More Code Examples! (00:05:05)
  35. Diving Deeper Into The LM Studio APIs (00:02:11)
  36. Module Introduction (00:01:42)
  37. Installing & Starting Ollama (00:02:09)
  38. Finding Usable Open Models (00:02:57)
  39. Running Open LLMs Locally via Ollama (00:07:44)
  40. Adding a GUI with Open WebUI (00:02:13)
  41. Dealing with Multiline Messages & Image Input (Multimodality) (00:02:39)
  42. Inspecting Models & Extracting Model Information (00:03:32)
  43. Editing System Messages & Model Parameters (00:06:02)
  44. Saving & Loading Sessions and Models (00:03:36)
  45. Managing Models (00:05:43)
  46. Creating Model Blueprints via Modelfiles (00:06:23)
  47. Creating Models From Modelfiles (00:03:27)
  48. Making Sense of Model Templates (00:06:40)
  49. Building a Model From Scratch From a GGUF File (00:06:38)
  50. Getting Started with the Ollama Server (API) (00:02:13)
  51. Exploring the Ollama API & Programmatic Model Access (00:05:19)
  52. Getting Structured Output (00:02:57)
  53. More Code Examples! (00:04:54)
  54. Roundup (00:01:45)