GNU ELPA - ellama


Description
Tool for interacting with LLMs
Latest
ellama-0.13.0.tar (.sig), 2024-Nov-30, 180 KiB
Maintainer
Sergey Kostyaev <sskostyaev@gmail.com>
Website
http://github.com/s-kostyaev/ellama
Browse ELPA's repository
CGit or Gitweb
Badge

To install this package from Emacs, use package-install or list-packages.

Full description

1. Ellama


Ellama is a tool for interacting with large language models from Emacs. It allows you to ask questions and receive responses from LLMs. Ellama can perform various tasks such as translation, code review, summarization, and enhancing grammar, spelling, or wording, all through the Emacs interface. Ellama natively supports streaming output, making it effortless to use with your preferred text editor.

The name "ellama" is derived from "Emacs Large LAnguage Model Assistant". The previous sentence was written by Ellama itself.

1.1. Installation

Just M-x package-install RET ellama RET. By default, Ellama uses the ollama provider with the zephyr model. If that suits you, install ollama and pull zephyr like this:

ollama pull zephyr

You can use Ellama with another model or another llm provider. In that case, customize the Ellama configuration like this:

(use-package ellama
    :bind ("C-c e" . ellama-transient-main-menu)
    :init
    ;; setup key bindings
    ;; (setopt ellama-keymap-prefix "C-c e")
    ;; language you want ellama to translate to
    (setopt ellama-language "German")
    ;; could be llm-openai for example
    (require 'llm-ollama)
    (setopt ellama-provider
	(make-llm-ollama
	     ;; this model should be pulled to use it
	     ;; value should be the same as you print in terminal during pull
	     :chat-model "llama3:8b-instruct-q8_0"
	     :embedding-model "nomic-embed-text"
	     :default-chat-non-standard-params '(("num_ctx" . 8192))))
    (setopt ellama-summarization-provider
	    (make-llm-ollama
	     :chat-model "qwen2.5:3b"
	     :embedding-model "nomic-embed-text"
	     :default-chat-non-standard-params '(("num_ctx" . 32768))))
    (setopt ellama-coding-provider
	    (make-llm-ollama
	     :chat-model "qwen2.5-coder:3b"
	     :embedding-model "nomic-embed-text"
	     :default-chat-non-standard-params '(("num_ctx" . 32768))))
    ;; Predefined llm providers for interactive switching.
    ;; You don't need to add ollama providers here; they can be selected
    ;; interactively without this. This is just an example.
    (setopt ellama-providers
	    '(("zephyr" . (make-llm-ollama
			   :chat-model "zephyr:7b-beta-q6_K"
			   :embedding-model "zephyr:7b-beta-q6_K"))
	      ("mistral" . (make-llm-ollama
			    :chat-model "mistral:7b-instruct-v0.2-q6_K"
			    :embedding-model "mistral:7b-instruct-v0.2-q6_K"))
	      ("mixtral" . (make-llm-ollama
			    :chat-model "mixtral:8x7b-instruct-v0.1-q3_K_M-4k"
			    :embedding-model "mixtral:8x7b-instruct-v0.1-q3_K_M-4k"))))
    ;; Naming new sessions with llm
    (setopt ellama-naming-provider
	    (make-llm-ollama
	     :chat-model "llama3:8b-instruct-q8_0"
	     :embedding-model "nomic-embed-text"
	     :default-chat-non-standard-params '(("stop" . ("\n")))))
    (setopt ellama-naming-scheme 'ellama-generate-name-by-llm)
    ;; Translation llm provider
    (setopt ellama-translation-provider
	  (make-llm-ollama
	   :chat-model "qwen2.5:3b"
	   :embedding-model "nomic-embed-text"
	   :default-chat-non-standard-params
	   '(("num_ctx" . 32768))))
    ;; customize display buffer behaviour
    ;; see ~(info "(elisp) Buffer Display Action Functions")~
    (setopt ellama-chat-display-action-function #'display-buffer-full-frame)
    (setopt ellama-instant-display-action-function #'display-buffer-at-bottom)
    :config
    ;; send last message in chat buffer with C-c C-c
    (add-hook 'org-ctrl-c-ctrl-c-hook #'ellama-chat-send-last-message))

1.2. Commands

1.2.1. ellama-chat

Ask Ellama about something by entering a prompt in an interactive buffer and continue the conversation. If called with a universal argument (C-u), it starts a new session with interactive llm model selection.

1.2.2. ellama-chat-send-last-message

Send the last user message extracted from the current ellama chat buffer.

1.2.3. ellama-ask-about

Ask Ellama about a selected region or the current buffer.

1.2.4. ellama-ask-selection

Send selected region or current buffer to ellama chat.

1.2.5. ellama-ask-line

Send current line to ellama chat.

1.2.6. ellama-complete

Complete text in current buffer with ellama.

1.2.7. ellama-translate

Ask Ellama to translate a selected region or word at the point.

1.2.8. ellama-translate-buffer

Translate current buffer.

1.2.9. ellama-define-word

Find the definition of the current word using Ellama.

1.2.10. ellama-summarize

Summarize a selected region or the current buffer using Ellama.

1.2.11. ellama-summarize-killring

Summarize text from the kill ring.

1.2.12. ellama-code-review

Review code in a selected region or the current buffer using Ellama.

1.2.13. ellama-change

Change text in a selected region or the current buffer according to a provided change.

1.2.14. ellama-make-list

Create a markdown list from the active region or the current buffer using Ellama.

1.2.15. ellama-make-table

Create a markdown table from the active region or the current buffer using Ellama.

1.2.16. ellama-summarize-webpage

Summarize a webpage fetched from a URL using Ellama.

1.2.17. ellama-provider-select

Select ellama provider.

1.2.18. ellama-code-complete

Complete selected code or code in the current buffer using Ellama.

1.2.19. ellama-code-add

Add new code according to a description, generating it with a provided context from the selected region or the current buffer using Ellama.

1.2.20. ellama-code-edit

Change selected code or code in the current buffer according to a provided change using Ellama.

1.2.21. ellama-code-improve

Improve selected code or code in the current buffer using Ellama.

1.2.22. ellama-generate-commit-message

Generate commit message based on diff.

1.2.23. ellama-improve-wording

Enhance the wording in the currently selected region or buffer using Ellama.

1.2.24. ellama-improve-grammar

Enhance the grammar and spelling in the currently selected region or buffer using Ellama.

1.2.25. ellama-improve-conciseness

Make the text of the currently selected region or buffer concise and simple using Ellama.

1.2.26. ellama-make-format

Render the currently selected text or the text in the current buffer as a specified format using Ellama.

1.2.27. ellama-load-session

Load ellama session from file.

1.2.28. ellama-session-remove

Remove ellama session.

1.2.29. ellama-session-switch

Change current active session.

1.2.30. ellama-session-rename

Rename current ellama session.

1.2.31. ellama-context-add-file

Add file to context.

1.2.32. ellama-context-add-buffer

Add buffer to context.

1.2.33. ellama-context-add-selection

Add selected region to context.

1.2.34. ellama-context-add-info-node

Add info node to context.

1.2.35. ellama-chat-translation-enable

Enable chat translation.

1.2.36. ellama-chat-translation-disable

Disable chat translation.

1.2.37. ellama-solve-reasoning-problem

Solve a reasoning problem with the Abstraction of Thought technique. It uses a chain of multiple messages to the LLM, helping it provide much better answers on reasoning problems. Even small LLMs like phi3-mini provide much better results on reasoning tasks using AoT.

1.2.38. ellama-solve-domain-specific-problem

Solve a domain-specific problem with a simple chain. It makes the LLM act like a professional and adds a planning step.

1.3. Keymap

Here is a table of keybindings and their associated functions in Ellama, using the ellama-keymap-prefix prefix (not set by default):

Keymap Function Description
"c c" ellama-code-complete Code complete
"c a" ellama-code-add Code add
"c e" ellama-code-edit Code edit
"c i" ellama-code-improve Code improve
"c r" ellama-code-review Code review
"c m" ellama-generate-commit-message Generate commit message
"s s" ellama-summarize Summarize
"s w" ellama-summarize-webpage Summarize webpage
"s c" ellama-summarize-killring Summarize killring
"s l" ellama-load-session Session Load
"s r" ellama-session-rename Session rename
"s d" ellama-session-remove Session delete
"s a" ellama-session-switch Session activate
"i w" ellama-improve-wording Improve wording
"i g" ellama-improve-grammar Improve grammar and spelling
"i c" ellama-improve-conciseness Improve conciseness
"m l" ellama-make-list Make list
"m t" ellama-make-table Make table
"m f" ellama-make-format Make format
"a a" ellama-ask-about Ask about
"a i" ellama-chat Chat (ask interactively)
"a l" ellama-ask-line Ask current line
"a s" ellama-ask-selection Ask selection
"t t" ellama-translate Text translate
"t b" ellama-translate-buffer Translate buffer
"t e" ellama-chat-translation-enable Translation enable
"t d" ellama-chat-translation-disable Translation disable
"t c" ellama-complete Text complete
"d w" ellama-define-word Define word
"x b" ellama-context-add-buffer Context add buffer
"x f" ellama-context-add-file Context add file
"x s" ellama-context-add-selection Context add selection
"x i" ellama-context-add-info-node Context add info node
"p s" ellama-provider-select Provider select

1.4. Configuration

The following variables can be customized for the Ellama client:

  • ellama-enable-keymap: Enable the Ellama keymap.
  • ellama-keymap-prefix: The keymap prefix for Ellama.
  • ellama-user-nick: The user nick in logs.
  • ellama-assistant-nick: The assistant nick in logs.
  • ellama-language: The language for Ollama translation. The default language is English.
  • ellama-provider: llm provider for ellama. The default provider is ollama with the zephyr model. Many providers are supported: ollama, Open AI, Vertex, GPT4All. For more information, see the llm documentation.
  • ellama-providers: association list of model llm providers with name as key.
  • ellama-spinner-type: Spinner type for ellama. The default type is progress-bar.
  • ellama-ollama-binary: Path to ollama binary.
  • ellama-auto-scroll: If enabled, the ellama buffer will scroll automatically during generation. Disabled by default.
  • ellama-fill-paragraphs: Option to customize ellama paragraphs filling behaviour.
  • ellama-name-prompt-words-count: Count of words in prompt to generate name.
  • Prompt templates for every command.
  • ellama-chat-done-callback: Callback called when ellama chat response generation is done. It should be a function with a single argument: the generated text string.
  • ellama-nick-prefix-depth: User and assistant nick prefix depth. Default value is 2.
  • ellama-sessions-directory: Directory for saved ellama sessions.
  • ellama-major-mode: Major mode for ellama commands. Org mode by default.
  • ellama-long-lines-length: Long line length for the fill-paragraph call. Too low a value can break generated code by splitting long comment lines. The default value is 100.
  • ellama-session-auto-save: Automatically save ellama sessions if set. Enabled by default.
  • ellama-naming-scheme: How to name new sessions.
  • ellama-naming-provider: LLM provider for generating session names by LLM. If not set ellama-provider will be used.
  • ellama-chat-translation-enabled: Enable chat translations if set.
  • ellama-translation-provider: LLM translation provider. ellama-provider will be used if not set.
  • ellama-coding-provider: LLM coding tasks provider. ellama-provider will be used if not set.
  • ellama-summarization-provider: LLM summarization provider. ellama-provider will be used if not set.
  • ellama-show-quotes: Show quotes content in chat buffer. Disabled by default.
  • ellama-chat-display-action-function: Display action function for ellama-chat.
  • ellama-instant-display-action-function: Display action function for ellama-instant.
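
As an illustration of ellama-chat-done-callback, here is a minimal sketch that echoes a notification when generation finishes. The callback body is hypothetical; the only documented contract is that the function receives the generated text string as its single argument:

;; Minimal sketch: notify when a chat response is done.
;; The callback receives the generated text as its single argument.
(setopt ellama-chat-done-callback
        (lambda (text)
          (message "Ellama finished (%d chars)" (length text))))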

1.5. Acknowledgments

Thanks to Jeffrey Morgan for the excellent ollama project. This project could not exist without it.

Thanks to zweifisch: ollama.el gave me some ideas about what an ollama client in Emacs can do.

Thanks to Dr. David A. Kunz: I got more ideas from gen.nvim.

Thanks to Andrew Hyatt for the llm library. Without it, only ollama would be supported.

2. Contributions

To contribute, submit a pull request or report a bug. This library is part of GNU ELPA; major contributions must be from someone with FSF papers. Alternatively, you can write a module and share it on a different archive like MELPA.

Old versions

ellama-0.12.8.tar.lz   2024-Nov-28  33.2 KiB
ellama-0.12.7.tar.lz   2024-Nov-24  33.1 KiB
ellama-0.12.6.tar.lz   2024-Nov-23  33.1 KiB
ellama-0.12.5.tar.lz   2024-Nov-18  33.1 KiB
ellama-0.12.4.tar.lz   2024-Oct-26  32.9 KiB
ellama-0.12.2.tar.lz   2024-Oct-16  32.4 KiB
ellama-0.11.14.tar.lz  2024-Sep-15  31.8 KiB
ellama-0.11.0.tar.lz   2024-Jul-02  30.5 KiB
ellama-0.10.2.tar.lz   2024-Jun-30  30.4 KiB
ellama-0.9.11.tar.lz   2024-Jun-26  29.3 KiB
ellama-0.9.0.tar.lz    2024-Apr-03  26.7 KiB
ellama-0.8.13.tar.lz   2024-Mar-31  25.7 KiB
ellama-0.8.1.tar.lz    2024-Feb-11  24.1 KiB
ellama-0.7.7.tar.lz    2024-Feb-10  22.9 KiB
ellama-0.7.0.tar.lz    2024-Jan-20  20.3 KiB
ellama-0.6.0.tar.lz    2024-Jan-18  20.1 KiB
ellama-0.5.7.tar.lz    2024-Jan-16  18.7 KiB
ellama-0.4.13.tar.lz   2023-Dec-28  18.0 KiB
ellama-0.4.0.tar.lz    2023-Dec-18  17.4 KiB
ellama-0.3.2.tar.lz    2023-Dec-18  16.9 KiB

News

1. Version 0.13.0

  • Add command ellama-chat-send-last-message to compose and modify messages within the chat buffer and send them directly from there.
  • Add ellama-chat-display-action-function and ellama-instant-display-action-function custom variables to customize display buffers behaviour.

2. Version 0.12.8

  • Provide code review in chat session.
  • Improve code review prompt template.

3. Version 0.12.7

  • Add change command to transient menu.

4. Version 0.12.6

  • Fixed transient dependency version.
  • Refined the code for lazy loading some dependencies.

5. Version 0.12.5

  • Add coding provider customization option.

6. Version 0.12.4

  • Fix documentation.
  • Improve translation template.
  • Improve commit message template.

7. Version 0.12.3

  • Add separated summarization provider customization option.
  • Improve summarization prompt template.

8. Version 0.12.2

  • Add problem solving chains to transient menu.

9. Version 0.12.1

  • Fix a bug where the user couldn't create a new session with a universal prefix argument.

10. Version 0.12.0

  • Add transient menu.

11. Version 0.11.14

  • Add interactive template modification for ellama-improve-* functions with universal prefix argument.

12. Version 0.11.13

  • Add ability to use sessions in other elisp packages.

13. Version 0.11.12

  • Fix ellama providers validation.

14. Version 0.11.11

  • Fix llm provider custom variables types.

15. Version 0.11.10

  • Fix commit message generation for partial commits.

16. Version 0.11.9

  • Fix issue when current window was changed after calling ellama-generate-commit-message.
  • Add ellama-generate-commit-message to keymap.

17. Version 0.11.8

  • Allow ollama-binary to accept the executable's base name.

18. Version 0.11.7

  • Add commit message generation.

19. Version 0.11.6

  • Add link to quoted content in a separate buffer.

20. Version 0.11.5

  • Prevent unnecessary line breaks at the end of generated text.

21. Version 0.11.4

  • Improve code templates and auto-naming.

22. Version 0.11.3

  • Fix autoscrolling for editing commands.

23. Version 0.11.2

  • Fix inability to use closures in stream-done callbacks.

24. Version 0.11.1

  • Add function ellama-context-add-text for non-interactive usage.

25. Version 0.11.0

  • Refactor markdown to org conversion code. Now all transformations will be applied only outside of code blocks.

26. Version 0.10.2

  • Fix bug when translation from markdown to org syntax breaks python code blocks.

27. Version 0.10.1

  • Add ellama-solve-domain-specific-problem command. It leverages the popular "act like a professional" prompt engineering method, enhanced by an automated planning step.

28. Version 0.10.0

  • Add ellama-solve-reasoning-problem command that implements the Abstraction of Thought technique. It uses a chain of multiple messages to the LLM, helping it provide much better answers on reasoning problems. Even small LLMs like phi3-mini provide much better results on reasoning tasks using AoT.

29. Version 0.9.11

  • Transform org quote content to avoid rendering issues.

30. Version 0.9.10

  • Add file quote context elements.

31. Version 0.9.9

  • Add info node quote context elements.

32. Version 0.9.8

  • Copy context from previous session on creating new session. This is useful when you create new session by calling ellama-ask-about with prefix argument.

33. Version 0.9.7

  • Add webpage quote context elements.

34. Version 0.9.6

  • Improve code blocks translation from markdown to org.

35. Version 0.9.5

  • Establish a fresh chat session whenever the ellama-chat function is invoked with a provider different from the one currently in use.

36. Version 0.9.4

  • Improve code blocks translation from markdown to org.

37. Version 0.9.3

  • Support summarizing shr url at point (eww and elfeed).
  • Add ellama-chain function for chaining multiple calls to LLMs.

38. Version 0.9.2

  • Allow summarizing urls without a doctype tag.
  • Summarize url at point.

39. Version 0.9.1

  • Add summarize killring command.

40. Version 0.9.0

  • Improve context management. Make it extendable.

… …