AI pair programming & LLM chats in Emacs
Content from the webinar slides for easier browsing.
A few warnings
Warning about my setup
I normally don’t share my Emacs init code because it relies on remapping the semicolon to a ring-map, which makes my keybindings absurd on another machine. I also use straight.el for package installation, and straight.el itself needs to be installed first.
But I keep being asked for my code after each webinar. So this time, I am sharing it.
Don’t copy-paste it into your init file: it would break your Emacs. Instead, use it to inspire your own setup or, better still, go to the packages’ READMEs. They are a much better place to start.
You have been warned… 🙂
Privacy warning
Most companies offer a free tier. These may come with lower speed, limited tokens, or older models.
The free tier always comes with a lack of privacy: your LLM usage is used for model training.
Paid services normally come with more privacy protections, but the landscape is evolving fast and accidents happen. Be mindful of what you type when you interact with LLMs. Do not type sensitive information into your prompt.
Safe API key storage
Your LLM API keys, login credentials, and passwords should never appear as plain text in your init file.
Some packages provide mechanisms for safe storage and retrieval. Others don’t. In that case, there are multiple options. My favourites are based on auth-source and GPG.
authinfo.gpg
Create a ~/.authinfo.gpg file with your keys in the format:
machine <hostname> login <username> password <password>
If you need a function to retrieve your key, use:
(lambda ()
  (auth-source-pick-first-password
   :host "<hostname>"
   :user "<username>"))
If you need a string, simply use:
(auth-source-pick-first-password
 :host "<hostname>"
 :user "<username>")
pass
Use the standard Unix password manager to store your API key by running in a Unix shell:
pass insert <key-name>
# enter your API key twice when prompted
Function:
(lambda ()
  (auth-source-pass-get 'secret "<key-name>"))
String:
(auth-source-pass-get 'secret "<key-name>")
copilot.el
GitHub copilot code completion (built on top of the GitHub copilot-language-server).
Requirements
- Access to GitHub copilot
Free tier available to everybody
Pro available free for students, teachers, maintainers of popular open-source projects
Paid subscriptions for Pro and Pro+
- Emacs ≥ 27
- Node.js ≥ 22
Installation
- Install and load the package and its dependency with your favourite method.
- Install the copilot server with M-x copilot-install-server.
- Login to Copilot with M-x copilot-login and follow the instructions.
You can test the setup with M-x copilot-diagnose.
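If you prefer completions to start automatically rather than on demand, the README suggests enabling copilot-mode in all programming buffers; a minimal sketch (I bind copilot-mode to a key instead, as in my setup):

```emacs-lisp
;; Enable Copilot completions in every programming buffer.
(add-hook 'prog-mode-hook #'copilot-mode)
```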
My setup (see warning)
;; dependency
(straight-use-package 'editorconfig)
(use-package copilot
  :straight (:host github
             :repo "copilot-emacs/copilot.el"
             :files ("dist" "*.el"))
  :bind (("C-8" . copilot-complete)
         ("; j c" . copilot-mode)
         :map copilot-completion-map
         ("C-j" . copilot-accept-completion)
         ("C-f" . copilot-accept-completion-by-word)
         ("C-t" . copilot-accept-completion-by-line)
         ("M-n" . copilot-next-completion)
         ("M-p" . copilot-previous-completion)))
copilot-chat.el
GitHub copilot chat
Requirements
Access to GitHub copilot.
Functionality
Chat with a model.
Markdown or Org markup
Chats can be saved and restored
Buffers can be added or removed as context
Can choose model
AI pair programming.
Write tests
Explain code/function at point/symbol at point
Review code
Document code
Fix code
Optimize code
Modify code
Customize prompts
Generate commit messages.
My setup (see warning)
;; dependency
(straight-use-package 'magit)
(use-package copilot-chat
  :straight (:host github :repo "chep/copilot-chat.el" :files ("*.el"))
  :after (org markdown-mode)
  :bind (("; c c" . copilot-chat)
         ("; c y" . copilot-chat-yank)
         ("; c m" . copilot-chat-set-model)
         :map prog-mode-map
         ;; explain symbol under point
         ("; c e s" . copilot-chat-explain-symbol-at-line)
         ;; explain function under point
         ("; c e f" . copilot-chat-explain-defun)
         ;; explain selected code
         ("; c e c" . copilot-chat-explain)
         ;; review selected code
         ("; c r c" . copilot-chat-review)
         ;; review current buffer
         ("; c r b" . copilot-chat-review-whole-buffer)
         ;; document selected code
         ("; c d" . copilot-chat-doc)
         ;; fix selected code
         ("; c f c" . copilot-chat-fix)
         ;; optimize selected code
         ("; c o" . copilot-chat-optimize)
         ;; write tests for selected code
         ("; c t" . copilot-chat-test)
         ;; apply a custom prompt to the function body under point
         ;; (instruct on how to refactor the function)
         ("; c f f" . copilot-chat-custom-prompt-function)
         :map copilot-chat-org-prompt-mode-map
         ("C-<return>" . copilot-chat-prompt-send)
         :map org-mode-map
         ("; c g" . copilot-chat-prompt-split-and-list)))
gptel
Access LLMs from any buffer.
Requirements
API key(s) for model(s), GitHub copilot, or model(s) running locally.
Functionality
Chat in dedicated buffer or from any buffer.
Markdown or Org markup
Choose model
Add/remove context (including media)
Set temperature
A number of packages are built on top of gptel
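gptel is not limited to one backend: you can register several and switch between them from gptel-menu. A sketch, assuming a hypothetical OpenAI key stored in ~/.authinfo.gpg under the backend’s default host:

```emacs-lisp
;; Register an additional backend alongside the one set in :config below.
;; gptel looks the key up via auth-source for the backend's host.
(gptel-make-openai "ChatGPT"
  :key #'gptel-api-key-from-auth-source
  :stream t)
```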
My setup (see warning)
(use-package gptel
  :config
  (setq gptel-model 'gemini-2.5-pro
        gptel-backend (gptel-make-gemini "Gemini"
                        :key #'gptel-api-key-from-auth-source
                        :stream t))
  :bind (("; g g" . gptel)
         ("; g s" . gptel-send)
         ("; g m" . gptel-menu)))
mcp.el
Emacs client for MCP.
Functionality
mcp.el integrates easily with gptel and copilot-chat.el.
The LLMs you use can then access any MCP server you setup.
My setup (see warning)
I use the Context7 MCP server.
(use-package mcp
  :after (:any gptel copilot-chat)
  :custom (mcp-hub-servers
           `(("context7" . (:command "npx"
                            :args ("-y" "@upstash/context7-mcp@latest" "--api-key"
                                   ,(auth-source-pick-first-password
                                     :host "context7_api_key"
                                     :user "secret"))))))
  :config (require 'mcp-hub)
  :hook (after-init . mcp-hub-start-all-server))
Integration with gptel
To have gptel integrate with mcp.el and automatically use the servers you have set, add to the gptel :config declaration:
(require 'gptel-integrations) ; always needed
(gptel-mcp-connect)           ; to connect automatically
Alternatively, you can call gptel-mcp-connect and gptel-mcp-disconnect to enable/disable servers manually.
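Putting it together, the gptel declaration from my setup above would grow into something like this (a sketch; adapt the backend to your own):

```emacs-lisp
(use-package gptel
  :config
  (setq gptel-model 'gemini-2.5-pro
        gptel-backend (gptel-make-gemini "Gemini"
                        :key #'gptel-api-key-from-auth-source
                        :stream t))
  (require 'gptel-integrations) ; always needed
  (gptel-mcp-connect)           ; to connect automatically
  :bind (("; g g" . gptel)
         ("; g s" . gptel-send)
         ("; g m" . gptel-menu)))
```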
Integration with copilot-chat.el
Use copilot-chat-set-mcp-servers to enable or disable servers.
chatgpt-shell
A shell to access LLMs.
Requirements
API key(s) for model(s) or local model(s).
Functionality
Chat in an Emacs shell.
Choose and swap models
Describe code
Proofread text
Write commit messages
Save/restore transcripts
My setup (see warning)
;; dependency
(use-package shell-maker
:straight (:type git :host github :repo "xenodium/shell-maker"))
(use-package chatgpt-shell
  :straight (:type git :host github
             :repo "xenodium/chatgpt-shell"
             :files ("chatgpt-shell*.el"))
  :init
  (setq chatgpt-shell-google-key
        (lambda ()
          (auth-source-pick-first-password
           :host "google_api_key" :user "secret")))
  :bind ("; c s" . chatgpt-shell))
aidermacs
aider in Emacs (forget about Cursor).
Requirements
The aider CLI installed, plus API key(s) for your model(s).
Functionality
- Ediff of AI-generated changes
- Custom prompts
- Code/chat/help/architect modes
- Integrates with vterm
- Auto-detects project root
- Voice commands
- Retrieves web content
- Writes tests
- Debugs code
- Weak model for fast easy tasks
- TRAMP support
- Easy passing of aider options
My setup (see warning)
(use-package aidermacs
  :bind (("; c a" . aidermacs-transient-menu))
  :config
  (setenv "GOOGLE_API_KEY" (auth-source-pick-first-password
                            :host "google_api_key"
                            :user "secret"))
  :custom
  (aidermacs-default-chat-mode 'architect)
  (aidermacs-default-model "gemini"))