Local LLMs via Ollama & LM Studio - The Practical Guide

Course Contents
  • Lesson 1. 00:00:19
    Welcome To The Course!
  • Lesson 2. 00:06:28
    What Exactly Are "Open LLMs"?
  • Lesson 3. 00:06:53
    Why Would You Want To Run Open LLMs Locally?
  • Lesson 4. 00:03:44
    Popular Open LLMs - Some Examples
  • Lesson 5. 00:04:48
    Where To Find Open LLMs?
  • Lesson 6. 00:07:18
    Running LLMs Locally - Available Options
  • Lesson 7. 00:04:05
    Check The Model Licenses!
  • Lesson 8. 00:01:21
    Module Introduction
  • Lesson 9. 00:04:22
    LLM Hardware Requirements - First Steps
  • Lesson 10. 00:05:35
    Deriving Hardware Requirements From Model Parameters
  • Lesson 11. 00:06:51
    Quantization To The Rescue!
  • Lesson 12. 00:05:51
    Does It Run On Your Machine?
  • Lesson 13. 00:02:04
    Module Introduction
  • Lesson 14. 00:01:09
    Running Locally vs Remotely
  • Lesson 15. 00:03:10
    Installing & Using LM Studio
  • Lesson 16. 00:09:05
    Finding, Downloading & Activating Open LLMs
  • Lesson 17. 00:04:54
    Using the LM Studio Chat Interface
  • Lesson 18. 00:03:27
    Working with System Prompts & Presets
  • Lesson 19. 00:02:33
    Managing Chats
  • Lesson 20. 00:06:29
    Power User Features For Managing Models & Chats
  • Lesson 21. 00:02:49
    Leveraging Multimodal Models & Extracting Content From Images (OCR)
  • Lesson 22. 00:03:28
    Analyzing & Summarizing PDF Documents
  • Lesson 23. 00:01:53
    Onwards To More Advanced Settings
  • Lesson 24. 00:06:33
    Understanding Temperature, top_k & top_p
  • Lesson 25. 00:04:46
    Controlling Temperature, top_k & top_p in LM Studio
  • Lesson 26. 00:04:18
    Managing the Underlying Runtime & Hardware Configuration
  • Lesson 27. 00:05:22
    Managing Context Length
  • Lesson 28. 00:05:09
    Using Flash Attention
  • Lesson 29. 00:05:30
    Working With Structured Outputs
  • Lesson 30. 00:02:36
    Using Local LLMs For Code Generation
  • Lesson 31. 00:05:22
    Content Generation & Few Shot Prompting (Prompt Engineering)
  • Lesson 32. 00:02:26
    Onwards To Programmatic Use
  • Lesson 33. 00:06:01
    LM Studio & Its OpenAI Compatibility
  • Lesson 34. 00:05:05
    More Code Examples!
  • Lesson 35. 00:02:11
    Diving Deeper Into The LM Studio APIs
  • Lesson 36. 00:01:42
    Module Introduction
  • Lesson 37. 00:02:09
    Installing & Starting Ollama
  • Lesson 38. 00:02:57
    Finding Usable Open Models
  • Lesson 39. 00:07:44
    Running Open LLMs Locally via Ollama
  • Lesson 40. 00:02:13
    Adding a GUI with Open WebUI
  • Lesson 41. 00:02:39
    Dealing with Multiline Messages & Image Input (Multimodality)
  • Lesson 42. 00:03:32
    Inspecting Models & Extracting Model Information
  • Lesson 43. 00:06:02
    Editing System Messages & Model Parameters
  • Lesson 44. 00:03:36
    Saving & Loading Sessions and Models
  • Lesson 45. 00:05:43
    Managing Models
  • Lesson 46. 00:06:23
    Creating Model Blueprints via Modelfiles
  • Lesson 47. 00:03:27
    Creating Models From Modelfiles
  • Lesson 48. 00:06:40
    Making Sense of Model Templates
  • Lesson 49. 00:06:38
    Building a Model From Scratch From a GGUF File
  • Lesson 50. 00:02:13
    Getting Started with the Ollama Server (API)
  • Lesson 51. 00:05:19
    Exploring the Ollama API & Programmatic Model Access
  • Lesson 52. 00:02:57
    Getting Structured Output
  • Lesson 53. 00:04:54
    More Code Examples!
  • Lesson 54. 00:01:45
    Roundup
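The programmatic-use lessons above (e.g. Lessons 33 and 50-51) revolve around the fact that both LM Studio's local server and Ollama expose an OpenAI-compatible `/v1/chat/completions` endpoint. As a minimal sketch of what that looks like, not material taken from the course itself: the model name `llama3.2` and the port numbers in the comments reflect common defaults, but your local setup may differ.

```python
import json
import urllib.request


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# LM Studio's local server defaults to port 1234; Ollama's to 11434.
req = build_chat_request("http://localhost:1234", "llama3.2", "Say hello.")

# Uncomment once a local server is actually running:
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
#     print(reply)
```

Because the endpoint mirrors the OpenAI API shape, the same request works against either tool by changing only the base URL.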