Fix readme and separate dev

laurel/helm
Laurel Orr 2 years ago
parent 01911685f6
commit b34e2a714b

@@ -1,7 +1,7 @@
# manifest
Prompt programming with FMs.
# Manifest
How to make prompt programming with FMs a little easier.
# Install
## Install
Download the code:
```bash
git clone git@github.com:HazyResearch/manifest.git
@@ -11,29 +11,63 @@ cd manifest
Install:
```bash
pip install poetry
poetry install
poetry run pre-commit install
poetry install --no-dev
```
or
Dev Install:
```bash
pip install poetry
make dev
```
# Run
Manifest is meant to be a very light weight package to help with prompt iteration. Two key design decisions are
## Getting Started
Getting started is simple. If using OpenAI, set `export OPENAI_API_KEY=<OPENAIKEY>`, then run
```python
from manifest import Manifest
# Start a manifest session
manifest = Manifest(
    client_name = "openai",
)
manifest.run("Why is the grass green?")
```
We also support AI21, OPT, and HuggingFace models (see [below](#huggingface-models)).
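Switching providers is just a change of constructor arguments. A minimal sketch, assuming the AI21 client is selected with `client_name = "ai21"` (the exact client name string is an assumption, not shown above):
```python
from manifest import Manifest

# Sketch only: "ai21" as the client name is an assumption based on the
# supported-backends list above; everything else mirrors the OpenAI example.
manifest = Manifest(
    client_name = "ai21",
)
manifest.run("Why is the grass green?")
```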
Caching is turned off by default; to cache results, run
```python
from manifest import Manifest
# Start a manifest session
manifest = Manifest(
    client_name = "openai",
    cache_name = "sqlite",
    cache_connection = "mycache.sqlite",
)
manifest.run("Why is the grass green?")
```
We also support a Redis backend.
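A Redis-backed cache is configured the same way as SQLite, with `cache_connection` pointing at a running Redis instance. A minimal sketch, reusing the `localhost:6379` address from the HuggingFace example below:
```python
from manifest import Manifest

# Sketch of a Redis-backed cache; assumes a local Redis server at localhost:6379.
manifest = Manifest(
    client_name = "openai",
    cache_name = "redis",
    cache_connection = "localhost:6379",
)
manifest.run("Why is the grass green?")
```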
## Manifest Components
Manifest is meant to be a very lightweight package to help with prompt iteration. Three key design decisions are
* Prompts are functional -- they can take an input example and dynamically change
* All models are behind API calls (e.g., OpenAI)
* Everything is cached for reuse to both save credits and to explore past results
* Everything can be cached for reuse to both save credits and explore past results
## Prompts
### Prompts
A Manifest prompt is a function that accepts a single input to generate a string prompt to send to a model.
```python
from manifest import Prompt
prompt = Prompt(lambda x: f"Hello, my name is {x}")
print(prompt("Laurel"))
>>> "Hello, my name is Laurel"
```
We also let you use static strings
```python
prompt = Prompt("Hello, my name is static")
@@ -41,9 +75,7 @@ print(prompt())
>>> "Hello, my name is static"
```
**Chaining prompts coming soon**
## Sessions
### Sessions
Each Manifest run is a session that connects to a model endpoint and a backend database to record prompt queries. To start a Manifest session for OpenAI, make sure you run
```bash
@@ -51,7 +83,7 @@ export OPENAI_API_KEY=<OPENAIKEY>
```
so we can access OpenAI.
Then, in a notebook, run:
Then run:
```python
from manifest import Manifest
@@ -104,7 +136,8 @@ If you want to change default parameters to a model, we pass those as `kwargs` t
```python
result = manifest.run(prompt, "Laurel", max_tokens=50)
```
# Huggingface Models
### Huggingface Models
To use a HuggingFace generative model, we provide a Flask application in `manifest/api` that hosts the models for you.
In a separate terminal or Tmux/Screen session, run
@@ -117,16 +150,12 @@ You will see the Flask session start and output a URL `http://127.0.0.1:5000`. P
manifest = Manifest(
    client_name = "huggingface",
    client_connection = "http://127.0.0.1:5000",
    cache_name = "redis",
    cache_connection = "localhost:6379"
)
```
If you have a custom model you trained, pass the model path to `--model_name`.
**Auto deployment coming soon**
# Development
## Development
Before submitting a PR, run
```bash
export REDIS_PORT="6380" # or whichever port your local Redis instance is running on for these tests

@@ -26,12 +26,6 @@ transformers = "^4.19.2"
torch = "^1.8"
requests = "^2.27.1"
tqdm = "^4.64.0"
types-redis = "^4.2.6"
types-requests = "^2.27.29"
types-PyYAML = "^6.0.7"
types-protobuf = "^3.19.21"
types-python-dateutil = "^2.8.16"
types-setuptools = "^57.4.17"
[tool.poetry.dev-dependencies]
black = "^22.3.0"
@@ -45,6 +39,12 @@ pytest = "^7.0.0"
pytest-cov = "^3.0.0"
python-dotenv = "^0.20.0"
recommonmark = "^0.7.1"
types-redis = "^4.2.6"
types-requests = "^2.27.29"
types-PyYAML = "^6.0.7"
types-protobuf = "^3.19.21"
types-python-dateutil = "^2.8.16"
types-setuptools = "^57.4.17"
[build-system]
build-backend = "poetry.core.masonry.api"
