Tags: karthink/gptel

v0.9.6

gptel: Bump version

* gptel.el: Bump version in header, improve package commentary.

v0.9.5

gptel: Bump version

* gptel.el: Bump version to 0.9.5

v0.9.0

gptel: Bump version

* gptel.el: Bump gptel version to 0.9.0.

v0.8.6

gptel: Update description and bump version

* gptel.el (header): Update description and bump version.

v0.8.5

gptel: Release v0.8.5

* gptel.el: Bump version.

v0.8.0

gptel: Make model parameters global

* gptel.el (gptel-backend, gptel-model, gptel-temperature,
gptel-max-tokens, gptel--num-messages-to-send,
gptel--system-message): Make all model/request parameters global
variables, i.e. not buffer-local by default.  This follows the
discussion in #249.

* gptel-transient.el (gptel-menu, gptel-system-prompt--setup,
gptel-system-prompt, gptel--suffix-system-message,
gptel--infix-provider, gptel--infix-temperature, gptel--switches,
gptel--set-buffer-locally, gptel--set-with-scope): and associated
transient methods: add a toggle `gptel--set-buffer-locally` to
allow model parameters to be set buffer-locally.  The function
`gptel--set-with-scope` can be used to reset a variable or set it
buffer-locally.

Reorder gptel-transient so all the custom classes, methods and
utility functions are at the top.

* README.org (all backend sections): Replace `setq-default` with
setq in the recommended configuration.
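
After this change, a plain `setq` configures these parameters globally, with `setq-local` available for per-buffer overrides.  A minimal sketch (the model name and parameter values here are illustrative, not taken from the release notes):

```elisp
;; Global request parameters -- plain setq suffices now that these
;; variables are no longer buffer-local by default:
(setq gptel-model "gpt-4"          ; illustrative model name
      gptel-temperature 0.7
      gptel-max-tokens 500)

;; Override a parameter in the current buffer only:
(setq-local gptel-temperature 1.0)
```

The same per-buffer scoping is available interactively via the `gptel--set-buffer-locally` toggle in the transient menu.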

v0.7.0

gptel: Bump version

* gptel.el: Bump version.

v0.6.5

gptel: docstrings for multi-LLM support, bump version

* gptel.el (gptel, gptel--insert-response, gptel-temperature,
gptel-pre-response-hook, gptel-response-filter-functions,
gptel-stream, gptel--convert-org, gptel-pre-response-hook,
gptel-max-tokens, gptel-mode, gptel-request,
gptel--insert-response, gptel-set-topic, gptel--convert-org):
Replace mentions of "ChatGPT" with "LLM" or equivalent.  Update
package description and remove commented out obsolete code.

Bump version to 0.6.5.

* gptel-transient.el (gptel--crowdsourced-prompts-url,
gptel--crowdsourced-prompts, gptel-menu, gptel-system-prompt,
gptel-rewrite-menu, gptel--infix-max-tokens,
gptel--suffix-system-message): Ditto.

* gptel-ollama.el (gptel--request-data): Ditto.

* gptel-kagi.el (gptel--request-data): Ditto.

* gptel-curl.el (gptel-curl--stream-insert-response,
gptel-curl--stream-cleanup, gptel-curl--stream-filter): Ditto.

v0.6.0

gptel-kagi: Add support for Kagi FastGPT

* gptel.el: Bump version and update package description.

* gptel-kagi.el (gptel--parse-response, gptel--request-data,
gptel--parse-buffer, gptel-make-kagi): Add new file and support
for the Kagi FastGPT LLM API.  Streaming and setting model
parameters (temperature, max tokens) are not supported by the API.
A Kagi backend can be added with `gptel-make-kagi`.

* README.org: Update with instructions for Kagi.
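
Registering a Kagi backend with `gptel-make-kagi` might look like the following sketch (the backend name, model string, and API-key placeholder are illustrative):

```elisp
;; Register a Kagi FastGPT backend and make it the active one.
;; "YOUR_KAGI_API_KEY" is a placeholder -- substitute your own key.
(setq gptel-backend (gptel-make-kagi "Kagi" :key "YOUR_KAGI_API_KEY")
      gptel-model "fastgpt")
```

Since the API supports neither streaming nor model parameters, settings such as `gptel-temperature` and `gptel-max-tokens` have no effect with this backend.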

v0.5.5

gptel: Bump version