A PopClip extension for ChatGPT

Yes, certainly. The details would depend on the specifics of the local LLM you are using. For example, someone has posted about doing this with Ollama.
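
Roughly, assuming the Ollama CLI is installed at its default location (/usr/local/bin/ollama) and with `llama3` standing in for whichever model you have pulled, a PopClip snippet along these lines should do it:

```yaml
# popclip
name: Ollama
after: paste-result
shell script: |
  # The selected text arrives in $POPCLIP_TEXT. Whatever the script prints
  # is pasted back over the selection, because of "after: paste-result".
  # Change "llama3" to a model you have actually pulled, and adjust the
  # binary path if ollama lives somewhere else on your system.
  /usr/local/bin/ollama run llama3 "$POPCLIP_TEXT"
```

Select the whole block with PopClip and it should offer to install it as an extension. If you'd rather not shell out to the CLI, the same idea works against Ollama's local HTTP API (POST to http://localhost:11434/api/generate) from a JavaScript action, which is closer in spirit to how the ChatGPT extension works.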