#ai #python #reading-list #tools

🔗 System-wide text summarization using Ollama and AppleScript
sheshbabu.com

Local LLMs like Mistral and Llama let us run ChatGPT-style large language models on our own computers. Since all the processing happens locally, I feel more comfortable feeding them personal data than I would with hosted LLMs.

A quick way to get started with local LLMs is to use an application like Ollama. It's very easy to install, but interacting with it means running commands in a terminal or installing a separate server-based GUI on your system.
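For instance, here is a minimal sketch of talking to Ollama's local HTTP API from Python, assuming Ollama is running on its default port and a model such as `mistral` has already been pulled:

```python
import json
import urllib.request

def summarize(text: str, model: str = "mistral") -> str:
    """Ask a local Ollama instance to summarize the given text."""
    payload = json.dumps({
        "model": model,
        "prompt": f"Summarize the following text:\n\n{text}",
        "stream": False,  # return one JSON object instead of a stream
    }).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(summarize("Local LLMs like Mistral and Llama run entirely on your own machine."))
```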

These aren't huge dealbreakers, but wouldn't it be nice if you could select a piece of text in any application and ask the LLM to summarize it?

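The linked post wires this up system-wide with AppleScript. As a rough, hypothetical illustration of the same idea in Python, the sketch below summarizes whatever is currently on the macOS clipboard (assuming you've copied the selected text with ⌘C and Ollama is running locally with `mistral`):

```python
import json
import subprocess
import urllib.request

# Grab whatever was last copied (e.g. with Cmd+C) from the macOS clipboard.
selected = subprocess.run(["pbpaste"], capture_output=True, text=True).stdout

if selected.strip():
    payload = json.dumps({
        "model": "mistral",
        "prompt": f"Summarize the following text:\n\n{selected}",
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])
else:
    print("Clipboard is empty - copy some text first.")
```

The article itself triggers this kind of flow from any application via AppleScript rather than a standalone script; see the link below for the full setup.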

continue reading on sheshbabu.com
