Ollama Local LLM Extension

I’ve been enjoying PopClip.

I recently installed Ollama locally to replace ChatGPT. Ollama runs LLMs on your own machine, and you can use it through the API documented at the link below.

https://github.com/ollama/ollama/blob/main/docs/api.md
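
For readers new to it, here is a minimal sketch (separate from the extension below) of the kind of request the local API accepts. It assumes Ollama is running on its default port 11434 and that the mistral model has already been pulled with `ollama pull mistral`:

import axios from "axios";

// Sketch of a single, non-streaming call to Ollama's /api/generate endpoint.
// Assumes Ollama is listening on its default port (11434) and that the
// "mistral" model has been pulled locally.
async function generateOnce(prompt: string): Promise<string> {
  const { data } = await axios.post("http://localhost:11434/api/generate", {
    model: "mistral",
    prompt,
    stream: false, // ask for one JSON object instead of a stream of chunks
  });
  return data.response; // the generated text
}

generateOnce("Say hello in one sentence.").then((text) => console.log(text));

The extension below sends the whole conversation transcript as the prompt, in exactly this shape.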

I modified the ChatGPT PopClip extension source, changing the API connection part to point at the local Ollama server.


// #popclip
// name: Ollama Chat
// icon: ollama.png
// identifier: js.popclip.extension.ollama.chat
// description: Send the selected text to Ollama LOCAL API and append the response.
// app: { name: Ollama, link: 'https://github.com/ollama/ollama/blob/main/docs/api.md' }
// popclipVersion: 4586
// keywords: Ollama chat
// entitlements: [network]
// language: typescript

import axios from "axios";

export const options: Option[] = [
  {
    identifier: "model",
    label: "Model",
    type: "multiple",
    defaultValue: "mistral",
    values: ["mistral", "another-model"],
  },
  {
    identifier: "systemMessage",
    label: "System Message",
    type: "string",
    description:
      "Optional system message to specify the behaviour of the AI agent.",
  },
  {
    identifier: "resetMinutes",
    label: "Reset Timer (minutes)",
    type: "string",
    description:
      "Reset the conversation if idle for this many minutes. Set blank to disable.",
    defaultValue: "15",
  },
  {
    identifier: "showReset",
    label: "Show Reset Button",
    type: "boolean",
    icon: "broom-icon.svg",
    description: "Show a button to reset the conversation.",
  },
];

type OptionsShape = {
  resetMinutes: string;
  model: string;
  systemMessage: string;
  showReset: boolean;
};

// typescript interfaces for Ollama API
interface Message {
  role: "user" | "system" | "assistant";
  content: string;
}
interface Response {
  model: string;
  created_at: string;
  response: string;
  done: boolean;
  done_reason: string;
  context: Array<number>;
  total_duration: number;
  load_duration: number;
  prompt_eval_count: number;
  prompt_eval_duration: number;
  eval_count: number;
  eval_duration: number;
}

// the extension keeps the message history in memory
const messages: Array<Message> = [];

// the last chat date
let lastChat: Date = new Date();

// reset the history
function reset() {
  print("Resetting chat history");
  messages.length = 0;
}

// get the content of the last `n` messages from the chat, trimmed and separated by double newlines
function getTranscript(n: number): string {
  return messages
    .slice(-n)
    .map((m) => m.content.trim())
    .join("\n\n");
}

// the main chat action
const chat: ActionFunction<OptionsShape> = async (input, options) => {
  const ollama = axios.create({
    baseURL: "http://localhost:11434/api",
  });

  // if the last chat was long enough ago, reset the history
  if (options.resetMinutes.length > 0) {
    const resetInterval = parseInt(options.resetMinutes) * 1000 * 60;
    if (new Date().getTime() - lastChat.getTime() > resetInterval) {
      reset();
    }
  }

  if (messages.length === 0) {
    // add the system message to the start of the conversation
    let systemMessage = options.systemMessage.trim();
    if (systemMessage) {
      messages.push({ role: "system", content: systemMessage });
    }
  }

  // add the new message to the history
  messages.push({ role: "user", content: input.text.trim() });

  // send the whole message history to Ollama
  try {
    const { data }: { data: Response } = await ollama.post("/generate", {
      model: options.model || "mistral",
      prompt: getTranscript(messages.length),
      format: "json", // constrains the model to reply in JSON; remove for plain-text replies
      stream: false,
    });

    // with stream: false, the API returns a single response object
    const combinedResponse = data.response;

    // add the response to the history
    messages.push({ role: "assistant", content: combinedResponse });
    lastChat = new Date();

    // if holding shift and alt, paste just the response.
    // if holding shift, copy just the response.
    // else, paste the last input and response.
    if (popclip.modifiers.shift && popclip.modifiers.option) {
      popclip.pasteText(getTranscript(1));
    } else if (popclip.modifiers.shift) {
      popclip.copyText(getTranscript(1));
    } else {
      popclip.pasteText(getTranscript(2));
      popclip.showSuccess();
    }
  } catch (e) {
    popclip.showText(getErrorInfo(e));
  }
};

export function getErrorInfo(error: unknown): string {
  if (typeof error === "object" && error !== null && "response" in error) {
    const response = (error as any).response;
    return `Message from Ollama (code ${response.status}): ${response.data?.error ?? JSON.stringify(response.data)}`;
  } else {
    return String(error);
  }
}

// export the actions
export const actions: Action<OptionsShape>[] = [
  {
    title: "Chat",
    code: chat,
  },
  {
    title: "Reset Chat",
    icon: "broom-icon.svg",
    stayVisible: true,
    requirements: ["option-showReset=1"],
    code: reset,
  },
];

When executed, an error occurs: only HTTPS connections are allowed, so the HTTP connection to the local Ollama API fails. I was wondering if there is a workaround.


The HTTPS restriction is due to App Transport Security. However, I think localhost non-TLS connections should be allowed for exactly this kind of use case. I believe it’s possible to add an exception for localhost, but this will require a new app build. I will look into it.
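
As an aside, once localhost HTTP is allowed, a small sketch like the following (assuming Ollama is on its default port 11434) can confirm the local server is reachable; the root endpoint replies with the plain text "Ollama is running":

import axios from "axios";

// Minimal connectivity check (a sketch, not part of the extension).
// Assumes Ollama is running locally on its default port, 11434.
async function checkOllama(): Promise<void> {
  try {
    const { data } = await axios.get("http://localhost:11434/");
    console.log(data); // expected: "Ollama is running"
  } catch (e) {
    console.log(`Could not reach the local Ollama server: ${String(e)}`);
  }
}

checkOllama();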

Thank you for your quick check. We’ll test again once the fix has been applied.

I have just pushed an update (Build 4659) to PopClip Beta. If you get a chance to try it, please let me know if it resolves the issue accessing http://localhost for you.


The beta version works well. Local Ollama responded correctly. Thank you for your quick response.


Where can I download this extension?

Select all the code in the code block above, and PopClip will prompt you to install the extension.


Thank you. It’s the same way as installing snippets.

To install it as a snippet, it just needs the // language: typescript field added in the header. I’ve edited the original post to add that.
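
For anyone installing this way, the selected text just needs to start with a header PopClip recognizes. A minimal sketch, using the names from the extension above, would look like this:

// #popclip
// name: Ollama Chat
// language: typescript
// ...remaining header fields, then the extension code...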


Hello everyone,

I’m trying to use your extension, but unfortunately I’m having trouble :cry:

I select my text and click on the extension, but nothing happens. Is there anything in particular I need to configure? Is there a loading indicator while the LLM is responding? Am I right that I select my text, it is processed by the LLM, and the result is then pasted under the selected text?

Thank you in advance for your feedback.



Ollama has released an update that lets you interact with it via a chat window, as ChatGPT does. Perhaps that makes this easier to achieve today? I tried again to get the code above to work, but nothing happens :confused:



Is there any error or other output in the Console window? (See the Debug Output section of the PopClip Extensions Developer Reference.)

I’m not sure if I did it correctly, but here is the error that came up:

require(axios): Using cached module (/Applications/PopClip.app/Contents/Resources/js_bundles/axios.js.lzfse)

That is actually normal diagnostic output rather than an error. It’s a bigger task to get into this than I have bandwidth for at the moment, especially as I don’t have Ollama myself to test with, but I hope someone can help.

I’d like to reiterate my request… Is it possible to get a little help with the extension?

Today I have published an official Ollama extension.

It’s working for me. @hercut – give it a go and see how you get on!

Wow!!!
I’ll give it a quick try and get back to you! I’m on vacation but will be able to do it next week!

Thank you so much!!!

The plugin seems to be working correctly with gemma3:4b!

Well, I asked it in the prompt to “correct and improve my text” and it sends me suggestions.
I probably need to improve this prompt :blush:

Thank you for your help!
