Creating a simple ChatGPT clone with Go

Sau Sheong · Published in Stackademic · 4 min read · Aug 6

I’ve been writing my LLM applications using Python for the past few months, because of the rich and extensive ecosystem that exists in the Python community. Langchain, LlamaIndex, HuggingFace and a whole slew of libraries exist to provide everything a programmer needs to write LLM applications.

However, at heart I’m still a Go programmer. I’ve been looking for a chance to get back to Go whenever I can, and I think I found it in the form of a Langchain port for Go called langchaingo. You weren’t expecting a fancier name, right?

Anyway, I was quite surprised and very pleased at how comprehensive it is, and I promptly wrote a simple ChatGPT clone with it (ChatGPT clones seem to be the Hello World of LLM applications). It’s very straightforward and easy. Let me just show you the code.

package main

import (
	"context"
	"encoding/json"
	"log"
	"net/http"
	"os"
	"text/template"

	"github.com/go-chi/chi"
	"github.com/go-chi/chi/middleware"
	"github.com/joho/godotenv"
	"github.com/tmc/langchaingo/llms/openai"
	"github.com/tmc/langchaingo/schema"
)

// initialise by loading environment variables from the .env file
func init() {
	err := godotenv.Load()
	if err != nil {
		log.Fatal("Error loading .env file")
	}
}

func main() {
	r := chi.NewRouter()
	r.Use(middleware.Logger)
	r.Handle("/static/*", http.StripPrefix("/static",
		http.FileServer(http.Dir("./static"))))
	r.Get("/", index)
	r.Post("/run", run)
	log.Println("\033[93mBreeze started. Press CTRL+C to quit.\033[0m")
	log.Fatal(http.ListenAndServe(":"+os.Getenv("PORT"), r))
}

// index serves the main page
func index(w http.ResponseWriter, r *http.Request) {
	t, err := template.ParseFiles("static/index.html")
	if err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}
	t.Execute(w, nil)
}

// call the LLM and return the response
func run(w http.ResponseWriter, r *http.Request) {
	prompt := struct {
		Input string `json:"input"`
	}{}
	// decode JSON from the client
	err := json.NewDecoder(r.Body).Decode(&prompt)
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	// create the LLM
	llm, err := openai.NewChat(openai.WithModel(os.Getenv("OPENAI_MODEL")))
	if err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}

	chatmsg := []schema.ChatMessage{…
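The code above reads its settings from a .env file via godotenv: OPENAI_MODEL and PORT appear directly in the code, and the langchaingo OpenAI client picks up the API key from the OPENAI_API_KEY environment variable. A minimal .env might look like this (the values are only examples; the model name and port are whatever you want to use):

```
# .env -- example values only; substitute your own API key
OPENAI_API_KEY=sk-...
OPENAI_MODEL=gpt-3.5-turbo
PORT=8000
```

With that in place, the server starts on the given port, and the front end POSTs JSON of the form {"input": "your prompt here"} to the /run endpoint, matching the anonymous struct the handler decodes into.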