Local LLMs via Ollama & LM Studio - The Practical Guide

- Lesson 1 (00:00:19): Welcome To The Course!
- Lesson 2 (00:06:28): What Exactly Are "Open LLMs"?
- Lesson 3 (00:06:53): Why Would You Want To Run Open LLMs Locally?
- Lesson 4 (00:03:44): Popular Open LLMs - Some Examples
- Lesson 5 (00:04:48): Where To Find Open LLMs?
- Lesson 6 (00:07:18): Running LLMs Locally - Available Options
- Lesson 7 (00:04:05): Check The Model Licenses!
- Lesson 8 (00:01:21): Module Introduction
- Lesson 9 (00:04:22): LLM Hardware Requirements - First Steps
- Lesson 10 (00:05:35): Deriving Hardware Requirements From Model Parameters
- Lesson 11 (00:06:51): Quantization To The Rescue!
- Lesson 12 (00:05:51): Does It Run On Your Machine?
- Lesson 13 (00:02:04): Module Introduction
- Lesson 14 (00:01:09): Running Locally vs Remotely
- Lesson 15 (00:03:10): Installing & Using LM Studio
- Lesson 16 (00:09:05): Finding, Downloading & Activating Open LLMs
- Lesson 17 (00:04:54): Using the LM Studio Chat Interface
- Lesson 18 (00:03:27): Working with System Prompts & Presets
- Lesson 19 (00:02:33): Managing Chats
- Lesson 20 (00:06:29): Power User Features For Managing Models & Chats
- Lesson 21 (00:02:49): Leveraging Multimodal Models & Extracting Content From Images (OCR)
- Lesson 22 (00:03:28): Analyzing & Summarizing PDF Documents
- Lesson 23 (00:01:53): Onwards To More Advanced Settings
- Lesson 24 (00:06:33): Understanding Temperature, top_k & top_p
- Lesson 25 (00:04:46): Controlling Temperature, top_k & top_p in LM Studio
- Lesson 26 (00:04:18): Managing the Underlying Runtime & Hardware Configuration
- Lesson 27 (00:05:22): Managing Context Length
- Lesson 28 (00:05:09): Using Flash Attention
- Lesson 29 (00:05:30): Working With Structured Outputs
- Lesson 30 (00:02:36): Using Local LLMs For Code Generation
- Lesson 31 (00:05:22): Content Generation & Few Shot Prompting (Prompt Engineering)
- Lesson 32 (00:02:26): Onwards To Programmatic Use
- Lesson 33 (00:06:01): LM Studio & Its OpenAI Compatibility
- Lesson 34 (00:05:05): More Code Examples!
- Lesson 35 (00:02:11): Diving Deeper Into The LM Studio APIs
- Lesson 36 (00:01:42): Module Introduction
- Lesson 37 (00:02:09): Installing & Starting Ollama
- Lesson 38 (00:02:57): Finding Usable Open Models
- Lesson 39 (00:07:44): Running Open LLMs Locally via Ollama
- Lesson 40 (00:02:13): Adding a GUI with Open WebUI
- Lesson 41 (00:02:39): Dealing with Multiline Messages & Image Input (Multimodality)
- Lesson 42 (00:03:32): Inspecting Models & Extracting Model Information
- Lesson 43 (00:06:02): Editing System Messages & Model Parameters
- Lesson 44 (00:03:36): Saving & Loading Sessions and Models
- Lesson 45 (00:05:43): Managing Models
- Lesson 46 (00:06:23): Creating Model Blueprints via Modelfiles
- Lesson 47 (00:03:27): Creating Models From Modelfiles
- Lesson 48 (00:06:40): Making Sense of Model Templates
- Lesson 49 (00:06:38): Building a Model From Scratch From a GGUF File
- Lesson 50 (00:02:13): Getting Started with the Ollama Server (API)
- Lesson 51 (00:05:19): Exploring the Ollama API & Programmatic Model Access
- Lesson 52 (00:02:57): Getting Structured Output
- Lesson 53 (00:04:54): More Code Examples!
- Lesson 54 (00:01:45): Roundup
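
As a taste of the programmatic topics listed above (Lesson 33 on LM Studio's OpenAI compatibility and Lessons 50-52 on the Ollama server API), here is a minimal sketch of calling a locally running model through an OpenAI-compatible REST endpoint. The port (1234 is LM Studio's default; Ollama typically listens on 11434) and the model identifier are assumptions that depend on your local setup.

```python
# Minimal sketch: chat completion against a local, OpenAI-compatible server.
# Port and model name are assumptions -- adjust them to whatever you have loaded.
import requests

BASE_URL = "http://localhost:1234/v1"  # LM Studio default; for Ollama try http://localhost:11434/v1

response = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": "llama-3.1-8b-instruct",  # hypothetical model id; check GET {BASE_URL}/models
        "messages": [
            {"role": "system", "content": "You are a concise assistant."},
            {"role": "user", "content": "Explain quantization in one sentence."},
        ],
        "temperature": 0.7,
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```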