Ask HN: How do you use local LLMs?

What applications (as in use cases, not software names) have you found for local LLMs that you use yourself on a daily basis, and which LLMs do you use?

10 points | by Haeuserschlucht 134 days ago

3 comments

  • henry_flower 134 days ago
    not exactly daily, but I use llama3.2-vision (via ollama) to generate a .txt file alongside every photo I take, containing a description of the photo. then I can just grep for, say, "selfie":

        $ alias omglol='find . -name "*.txt" | xargs -n50 grep -li'
        $ omglol selfie
        ./10/IMG_20241019_204444.txt
        ./09/IMG_20240930_082108.txt
        ./09/IMG_20240930_082118.txt
        ./07/IMG_20240712_154559.txt
        ./07/IMG_20240712_154554.txt
    
    or to do a slide show:

        $ omglol selfie | sed s/txt/jpg/ | xargs feh
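
    The generation step isn't shown above; here is a minimal sketch of producing those .txt files, assuming the `ollama` CLI is installed with llama3.2-vision pulled (the prompt wording and directory layout are assumptions, not the commenter's actual script):

```shell
# Sketch: write a description .txt next to each photo.
# Assumes `ollama` is installed and llama3.2-vision has been pulled;
# the glob and prompt are illustrative.
for f in ./*/IMG_*.jpg; do
  txt="${f%.jpg}.txt"
  [ -e "$txt" ] && continue   # skip photos that already have a description
  ollama run llama3.2-vision "Describe this photo: $f" > "$txt"
done
```

    The `${f%.jpg}.txt` expansion is what keeps the description next to its photo, which is also what makes the `sed s/txt/jpg/` trick above work in reverse.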
    • pizza 133 days ago
      You might be able to load up all the txt files into the embedded vector db in `llm` so that you could also query them semantically.
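
      A minimal sketch of that idea, assuming Simon Willison's `llm` CLI with an embedding model available (the collection name, model, and query here are assumptions):

```shell
# Embed every description file into a named collection, then query by
# meaning rather than by literal grep match. Assumes `llm` is installed
# with an embedding model (e.g. via the llm-sentence-transformers plugin).
llm embed-multi photos --files . '*/*.txt' \
  -m sentence-transformers/all-MiniLM-L6-v2 --store
llm similar photos -c "a person photographing themselves"
```

      Unlike grep, this would also surface descriptions that never contain the word "selfie".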
    • Haeuserschlucht 134 days ago
      Great idea!
  • thatjoeoverthr 130 days ago
    Small pretrained models are often too “dumb” to be useful out of the box, but if you have a task you can tune a model for, their pretraining means you can produce an effective model from a shockingly small corpus. These can be more reliable than off-the-shelf models in an automated process, because even the large pretrained models have a lot of “behaviors” you can trigger with surprising inputs. Most recently I retrained SmolLM2 to translate intents into SDXL prompts.
  • ud0 132 days ago
    I use DeepSeek via LM Studio for reading sensitive/non-sensitive docs and contracts, and for searching bank statements & bills.
    • Haeuserschlucht 130 days ago
      > for reading sensitive/non-sensitive docs, contracts

      Can you elaborate?