AI: Introduction to Ollama for Running LLMs Locally
31 May 2025
I would really like to play with some LLMs locally, because it would help me better understand the nuances of how they work. It's like getting acquainted with AWS without ever having dealt with at least VirtualBox first – working with the AWS Console or the AWS API alone won't give you an understanding of what is happening under… Read More »