Content fetch and aggregation bot for Hugo data-driven websites

README.md

MIRRORED FROM: https://git.blob42.xyz/blob42/hugobot

HUGOBOT

hugobot is a bot that automates the fetching and aggregation of content for Hugo data-driven websites. It has the following features:

Data fetch

  • Use the feeds table to register feeds that will be fetched periodically.
  • Currently, it can handle these types of feeds: RSS, GitHub Releases, and newsletters.
  • To define your own feed types, implement the JobHandler interface (see handlers/handlers.go and the sketch after this list).
  • Hugobot automatically fetches new posts from the registered feeds.
  • Storage uses an SQLite database with feeds and posts tables.
  • The scheduler can handle an unlimited number of tasks and uses LevelDB for caching and resuming jobs.
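
As an illustration of the handler extension point, here is a minimal sketch of a custom feed type. The actual JobHandler interface is defined in handlers/handlers.go; the method set, the Feed and Post stand-in types, and the StatusPageHandler name below are assumptions made for this example only.

    // Hypothetical custom feed handler. The single Handle method is an assumed
    // shape of the real JobHandler interface in handlers/handlers.go.
    package handlers

    import "time"

    // Feed and Post are simplified stand-ins for hugobot's real types.
    type Feed struct {
        Name string
        URL  string
    }

    type Post struct {
        Title     string
        Link      string
        Published time.Time
    }

    // JobHandler is the assumed interface a custom feed type implements.
    type JobHandler interface {
        Handle(feed Feed) ([]Post, error)
    }

    // StatusPageHandler is a hypothetical handler that scrapes a status page
    // and turns each incident into a post.
    type StatusPageHandler struct{}

    func (h StatusPageHandler) Handle(feed Feed) ([]Post, error) {
        // Fetch feed.URL, parse the incidents, and map each one to a Post here.
        return []Post{
            {Title: "example incident", Link: feed.URL, Published: time.Now()},
        }, nil
    }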

Hugo export

  • Data is automatically exported to the configured Hugo website path.
  • It can export data as Markdown files or as JSON/TOML data files.
  • You can customize all fields in the exported files.
  • You can define custom output formats by implementing the FormatHandler interface (see the sketch after this list).
  • You can register custom filters and post-processing steps for exported posts; they are applied at export time, so the raw data stored in the database stays untouched.
  • You can force data export using the CLI.
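
For the output-format extension point, here is a minimal sketch of a custom format handler. The real FormatHandler interface lives in hugobot's export code; the single Format method and the Post stand-in type below are assumptions for illustration.

    // Hypothetical custom output format. The Format method signature is an
    // assumed shape of hugobot's real FormatHandler interface.
    package export

    import (
        "encoding/json"
        "time"
    )

    // Post is a simplified stand-in for hugobot's post type.
    type Post struct {
        Title     string    `json:"title"`
        Link      string    `json:"link"`
        Published time.Time `json:"published"`
    }

    // FormatHandler is the assumed interface: it renders a post into the bytes
    // written out to the Hugo site.
    type FormatHandler interface {
        Format(p Post) ([]byte, error)
    }

    // JSONFormat exports each post as an indented JSON data file.
    type JSONFormat struct{}

    func (JSONFormat) Format(p Post) ([]byte, error) {
        return json.MarshalIndent(p, "", "  ")
    }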

API

  • hugobot also includes a web API, built with the gin-gonic framework, that can be used with Hugo's data-driven mode (see the sketch after this list).
  • You can insert data into and query data from the database through the API. This part is still a work in progress, but the missing endpoints are easy to add on the API side.
  • For example, it can be used to automate the generation of Bitcoin addresses for new articles on bitcointechweekly.com.
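
To make the data-driven flow concrete, here is a minimal sketch of a gin endpoint that a Hugo template could query. The route path, port, and hard-coded posts are illustrative assumptions, not hugobot's actual API surface.

    package main

    import (
        "net/http"

        "github.com/gin-gonic/gin"
    )

    type post struct {
        Title string `json:"title"`
        Link  string `json:"link"`
    }

    func main() {
        r := gin.Default()

        // Serve posts as JSON so Hugo can consume them at build time.
        r.GET("/api/posts", func(c *gin.Context) {
            c.JSON(http.StatusOK, []post{
                {Title: "example post", Link: "https://example.org/post"},
            })
        })

        r.Run(":8080")
    }

On the Hugo side, a template would fetch this with something like {{ $posts := getJSON "http://localhost:8080/api/posts" }}.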

Other

  • Some commands are available through the CLI (built with github.com/urfave/cli), and you can add your own custom commands (see the sketch below).
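
Here is a minimal sketch of registering an extra command, assuming the v1 API of github.com/urfave/cli; the command name and behaviour are hypothetical, not ones hugobot ships (its real commands live in commands.go and feed_commands.go).

    package main

    import (
        "fmt"
        "log"
        "os"

        "github.com/urfave/cli"
    )

    func main() {
        app := cli.NewApp()
        app.Name = "hugobot"

        // Hypothetical extra command added alongside the built-in ones.
        app.Commands = []cli.Command{
            {
                Name:  "hello",
                Usage: "example custom command",
                Action: func(c *cli.Context) error {
                    fmt.Println("hello from a custom command")
                    return nil
                },
            },
        }

        if err := app.Run(os.Args); err != nil {
            log.Fatal(err)
        }
    }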

Sqliteweb interface

  • See the Docker files for more information.

First time usage

  • The first time you run the program, it automatically generates the database. You can then add your feeds to the SQLite database with your preferred SQLite GUI, or directly with a small program like the sketch below.
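
As an alternative to a GUI, a feed row can be inserted directly. The database path, table name, and column names below are assumptions about the generated schema; check the schema hugobot actually creates before relying on them.

    package main

    import (
        "database/sql"
        "log"

        _ "github.com/mattn/go-sqlite3" // SQLite driver (any database/sql driver works)
    )

    func main() {
        // "hugobot.sqlite" is a hypothetical path; use the file hugobot generates.
        db, err := sql.Open("sqlite3", "hugobot.sqlite")
        if err != nil {
            log.Fatal(err)
        }
        defer db.Close()

        // The feeds table is mentioned in the README; the (name, url) columns
        // are assumed for this example.
        _, err = db.Exec(
            "INSERT INTO feeds (name, url) VALUES (?, ?)",
            "Example Feed", "https://example.org/feed.xml",
        )
        if err != nil {
            log.Fatal(err)
        }
    }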

Contribution

  • We welcome pull requests. Our current priority is adding tests.
  • Check the TODO section.

TODO:

  • Add tests.
  • Handle more feed formats: tweets, mailing-list emails ...
  • TLS support in the API (not a priority, can be done with a reverse proxy).