
Episode 128 - Into the Void... local AI coding with local LLMs for real this time!
Firstly, I am not sure what AI did to generate that title image!
As mentioned in the shameless plug, I will soon be starting some further deep dives and one-to-one sessions via Patreon… if you are interested, of course! (see links below).
Anyway, plugging aside, I did a video a few months ago about Cursor AI (a VS Code based AI editor) and using local LLMs. Although Cursor can use local LLMs (via Ollama or LMStudio etc.), the experience wasn't very well integrated :)
The VOID AI Editor addresses this. It is an open-source alternative to Cursor AI, still based on VS Code, and it integrates directly with local LLMs such as Llama 4, DeepSeek, etc.
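If you want to sanity-check that a local model is actually up and answering before wiring it into Void, here is a rough sketch (not from the episode): it just pings Ollama's default local API. The model name and prompt are placeholders, so swap in whatever you have pulled with "ollama pull".

# Quick check that a local Ollama model responds before pointing an editor at it.
# Assumes Ollama is running on its default port (11434); model name is only an example.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "deepseek-r1:8b",          # example model, use any you have pulled locally
        "prompt": "Say hello in five words.",
        "stream": False,                     # return one JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])

If that prints a reply, the local model is reachable and an editor like Void (or anything else speaking to Ollama) should be able to use it.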
Hey, this is a free podcast; however, if you feel you want to support me, then check out Patreon. I will have some more detailed deep dives for Patreon members, as well as one-to-one sessions.
Or just buy a unicorn a coffee here!
Oh, and yes, I have ended up on YouTube (doesn’t everyone eventually?):
https://www.youtube.com/@justfifteenmins but don't worry, there's no ugly face on screen (yet!).
Unicorn University:
https://applied-ai-services.kit.com/020c6040f2
This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit www.justfivemins.com