Merge pull request #305 from miurla/readme
chore: Modify README to include instructions for local Redis setup
miurla committed Aug 12, 2024
2 parents f68855d + 1ffe439 commit 81fc8cd
Showing 1 changed file with 18 additions and 8 deletions.
26 changes: 18 additions & 8 deletions README.md
@@ -31,6 +31,7 @@ An AI-powered search engine with a generative UI.
- Ollama Provider ([Unstable](https://github.com/miurla/morphic/issues/215))
- Specify the model to generate answers
- Groq API support ([#58](https://github.com/miurla/morphic/pull/58))
- Local Redis support

## 🧱 Stack

@@ -39,7 +40,7 @@ An AI-powered search engine with a generative UI.
- Generative Model: [OpenAI](https://openai.com/)
- Search API: [Tavily AI](https://tavily.com/) / [Serper](https://serper.dev)
- Reader API: [Jina AI](https://jina.ai/)
- Serverless Database: [Upstash](https://upstash.com/)
- Database (Serverless/Local): [Upstash](https://upstash.com/) / [Redis](https://redis.io/)
- Component library: [shadcn/ui](https://ui.shadcn.com/)
- Headless component primitives: [Radix UI](https://www.radix-ui.com/)
- Styling: [Tailwind CSS](https://tailwindcss.com/)
@@ -65,6 +66,8 @@ bun install

Follow the guide below to set up Upstash Redis. Create a database and obtain `UPSTASH_REDIS_REST_URL` and `UPSTASH_REDIS_REST_TOKEN`. Refer to the [Upstash guide](https://upstash.com/blog/rag-chatbot-upstash#setting-up-upstash-redis) for instructions on how to proceed.

If you intend to use a local Redis instance, you can skip this step.
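
If Docker is available, one minimal way to start a local Redis server is shown below (the container name is just an example; use whatever Redis setup you prefer):

```
# start a Redis container in the background on the default port 6379
docker run -d --name local-redis -p 6379:6379 redis
```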

### 4. Fill out secrets

```
@@ -97,15 +100,23 @@ To use Upstash Redis:
2. Set `UPSTASH_REDIS_REST_URL` and `UPSTASH_REDIS_REST_TOKEN` with your Upstash credentials.
```
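
For illustration only, the Upstash entries in your environment file might look like the following (placeholder values shown; the full list of variables comes from the project's example env file):

```
# replace with the values from your Upstash database (placeholders shown)
UPSTASH_REDIS_REST_URL=https://your-database.upstash.io
UPSTASH_REDIS_REST_TOKEN=your-rest-token
```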

_Note: This project focuses on Generative UI and requires complex output from LLMs. Currently, it's assumed that the official OpenAI models will be used. Although it's possible to set up other models, if you use an OpenAI-compatible model, we don't guarantee that it will work._

### 5. Run app locally

```
bun dev
```
#### Using Bun

To run the application locally using Bun, execute the following command:

`bun dev`

You can now visit http://localhost:3000.
You can now visit http://localhost:3000 in your web browser.
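
For a production-style run rather than the dev server, the standard Next.js scripts should work, assuming `build` and `start` scripts are defined in `package.json`:

```
bun run build   # create an optimized production build
bun run start   # serve the built app (port 3000 by default)
```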

#### Using Docker

To run the application using Docker, use the following command:

`docker compose up -d`

This will start the application in detached mode. You can access it at http://localhost:3000.
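
Once the containers are running, the standard Compose commands can be used to inspect or stop them:

```
docker compose logs -f   # follow the application logs
docker compose down      # stop and remove the containers
```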

## 🌐 Deploy

@@ -169,4 +180,3 @@ This will allow you to use Morphic as your default search engine in the browser.
- LLaMA3.1 70B
- LLaMA3 8B
- LLaMA3 70B
