MIRRORED FROM: https://git.sp4ke.com/sp4ke/hugobot

HUGOBOT

hugobot is an automated content fetch and aggregation bot for Hugo data-driven websites. It has the following features:

Data fetch

  • Use the feeds table to register feeds that will be periodically fetched, stored, and exported into the Hugo project.
  • Currently handles these feed types: RSS, GitHub releases, newsletters.
  • Define your own feed types by implementing the JobHandler interface (see handlers/handlers.go).
  • Hugobot runs periodically and automatically downloads new posts from the feeds you defined.
  • Storage is done with SQLite.
  • The scheduler can handle any number of tasks and uses LevelDB for caching/resuming jobs.
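A custom feed type boils down to a type satisfying the JobHandler interface. The exact method set lives in handlers/handlers.go and may differ from this sketch; the Feed fields and the StatusPageHandler type below are illustrative, not the project's actual schema:

```go
package main

import "fmt"

// Feed loosely mirrors a row from the feeds table; field names are
// illustrative, not the real schema.
type Feed struct {
	Name string
	URL  string
}

// JobHandler is an assumed shape of the interface described in
// handlers/handlers.go; the real method set may differ.
type JobHandler interface {
	Handle(feed Feed) error
}

// StatusPageHandler is a hypothetical custom feed type that would poll a
// service status page instead of an RSS feed.
type StatusPageHandler struct{}

func (h StatusPageHandler) Handle(feed Feed) error {
	// A real handler would fetch feed.URL, parse its entries, and store
	// the resulting posts in the database.
	fmt.Printf("fetching %s from %s\n", feed.Name, feed.URL)
	return nil
}

func main() {
	var h JobHandler = StatusPageHandler{}
	if err := h.Handle(Feed{Name: "example", URL: "https://example.com/status"}); err != nil {
		fmt.Println("error:", err)
	}
}
```

The scheduler would then dispatch each registered feed to the handler matching its type.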

Hugo export

  • Data is automatically exported to the configured Hugo website path.
  • It can export Markdown files or JSON/TOML data files.
  • All fields in the exported files can be customized.
  • You can define custom output formats using the FormatHandler interface.
  • You can register custom filters and post-processing on exports to avoid changing the raw data stored in the DB.
  • You can force an export of content through the CLI.
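A custom output format is a type implementing the FormatHandler interface. The method names below are an assumption about its shape, and the Post type is a stand-in for the project's real post record; this minimal sketch shows a hypothetical JSON data-file formatter:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Post is an illustrative post record; the real type lives in the
// project's types package.
type Post struct {
	Title string `json:"title"`
	Link  string `json:"link"`
}

// FormatHandler is an assumed shape of the export interface; the actual
// method set may differ.
type FormatHandler interface {
	Ext() string                   // file extension for the output format
	Format(p Post) ([]byte, error) // render one post into that format
}

// JSONFormatter is a hypothetical custom output format producing JSON
// data files for Hugo's data directory.
type JSONFormatter struct{}

func (JSONFormatter) Ext() string { return ".json" }

func (JSONFormatter) Format(p Post) ([]byte, error) {
	return json.MarshalIndent(p, "", "  ")
}

func main() {
	var f FormatHandler = JSONFormatter{}
	out, err := f.Format(Post{Title: "Hello", Link: "https://example.com"})
	if err != nil {
		panic(err)
	}
	fmt.Println(f.Ext())
	fmt.Println(string(out))
}
```

The exporter would pick the formatter registered for a feed and write one file per post using Ext() for the filename suffix.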

API

  • Uses gin-gonic.

  • hugobot also includes a webserver API that can be used with Hugo's data-driven mode.

  • Insert and query data from the DB. This is still a WIP; you can easily add the missing code on the API side to automate adding/querying data from the DB.

  • An example use case is the automated generation of Bitcoin addresses for new articles on bitcointechweekly.com.

Other

  • Some commands are available through the CLI; you can add your own custom commands.

Sqliteweb interface

  • See the Docker files.

First time usage

  • The database is automatically created the first time you run the program. You can add your feeds directly to the SQLite database using your favorite SQLite GUI or the web GUI provided in the docker-compose file.