Tags: karthink/gptel
gptel: Make model parameters global

* gptel.el (gptel-backend, gptel-model, gptel-temperature, gptel-max-tokens, gptel--num-messages-to-send, gptel--system-message): Make all model/request parameters global variables, i.e. not buffer-local by default, following the discussion in #249.

* gptel-transient.el (gptel-menu, gptel-system-prompt--setup, gptel-system-prompt, gptel--suffix-system-message, gptel--infix-provider, gptel--infix-temperature, gptel--switches, gptel--set-buffer-locally, gptel--set-with-scope) and associated transient methods: Add a toggle `gptel--set-buffer-locally` to allow model parameters to be set buffer-locally. The function `gptel--set-with-scope` can be used to reset a variable or set it buffer-locally. Reorder gptel-transient.el so that all the custom classes, methods and utility functions are at the top.

* README.org (all backend sections): Replace `setq-default` with `setq` in the recommended configuration.
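Since these variables are now plain global variables, a bare `setq` in the user's init file is sufficient; a minimal sketch of the before/after configuration (the model name and temperature shown are illustrative, not taken from the commit):

```emacs-lisp
;; Before this change the parameters were buffer-local by default,
;; so the README recommended `setq-default':
;;   (setq-default gptel-model "gpt-4")

;; Now that they are global, `setq' is enough:
(setq gptel-model "gpt-4"       ; illustrative model name
      gptel-temperature 0.7)    ; illustrative value

;; A buffer-local override (what the new transient toggle enables
;; interactively) can still be made the usual Elisp way:
(setq-local gptel-temperature 1.0)
```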
gptel: Docstrings for multi-LLM support, bump version

* gptel.el (gptel, gptel--insert-response, gptel-temperature, gptel-pre-response-hook, gptel-response-filter-functions, gptel-stream, gptel--convert-org, gptel-max-tokens, gptel-mode, gptel-request, gptel-set-topic): Replace mentions of "ChatGPT" with "LLM" or equivalent. Update the package description and remove commented-out obsolete code. Bump version to 0.6.5.

* gptel-transient.el (gptel--crowdsourced-prompts-url, gptel--crowdsourced-prompts, gptel-menu, gptel-system-prompt, gptel-rewrite-menu, gptel--infix-max-tokens, gptel--suffix-system-message): Ditto.

* gptel-ollama.el (gptel--request-data): Ditto.

* gptel-kagi.el (gptel--request-data): Ditto.

* gptel-curl.el (gptel-curl--stream-insert-response, gptel-curl--stream-cleanup, gptel-curl--stream-filter): Ditto.
gptel-kagi: Add support for Kagi FastGPT

* gptel.el: Bump version and update the package description.

* gptel-kagi.el (gptel--parse-response, gptel--request-data, gptel--parse-buffer, gptel-make-kagi): Add new file with support for the Kagi FastGPT LLM API. Streaming and model parameters (temperature, max tokens) are not supported by the API. A Kagi backend can be added with `gptel-make-kagi`.

* README.org: Update with setup instructions for Kagi.
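Per the commit, registering the backend goes through `gptel-make-kagi`; a minimal configuration sketch, assuming an API key (the key string is a placeholder, and the `"fastgpt"` model name is an assumption about the backend's default):

```emacs-lisp
;; Register a Kagi FastGPT backend.  The :key value is a placeholder;
;; it can also be a function that returns the key.
(gptel-make-kagi "Kagi" :key "YOUR_KAGI_API_KEY")

;; Optionally make it the default backend for new requests:
(setq gptel-backend (gptel-make-kagi "Kagi" :key "YOUR_KAGI_API_KEY")
      gptel-model "fastgpt")  ; assumed model name for FastGPT
```

Note that, as the commit states, this backend neither streams responses nor honors `gptel-temperature` or `gptel-max-tokens`.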