Compare commits


63 Commits

Author SHA1 Message Date
587dedf7a9 Update to snappy v0.8.0 2025-02-03 23:40:46 +01:00
fb9c1d9fca rename to github.com/ekzyis/hnbot 2025-02-03 23:40:23 +01:00
2cfb0eaa31 refactor into individual packages 2025-02-03 23:40:23 +01:00
7641414693 Update to snappy v0.7.0 2024-12-28 00:50:33 +01:00
73b2f6a430 Update to snappy v0.5.1rc1 2024-07-02 23:56:59 +02:00
cc5a05574a Fix unique constraint hit
For HN reposts, the same SN dupe was found but since sn_items.id was the primary key, a unique constraint was hit.

This meant that posting the same item was attempted over and over again since the HN item id was never found in sn_items.

I manually migrated the database.
2024-04-25 18:24:27 +02:00
0a0470ae67 Ignore hnbot.sqlite3 2024-04-09 02:06:18 +02:00
822ddda1da Update to snappy v0.4.2 2024-04-07 05:49:38 +02:00
3d35aa5e19 Update README.md 2024-04-07 05:45:32 +02:00
8e28ee4691 Remove .env.template 2024-04-07 05:45:25 +02:00
35a4112c22 Remove discord code 2024-04-07 05:42:05 +02:00
24909f5d88 Delete code related to charts 2024-04-07 05:38:57 +02:00
3bf5c8baba Update to snappy v0.4.1 2024-04-07 05:37:31 +02:00
ee95aa89bd Remove chart link and cron.sh
I think it was more annoying than interesting to look at.

Also, it didn't seem to be worth the maintenance of the cronjob etc.
2024-04-04 20:24:02 +02:00
ecda7954ef Query formatting
I first thought that ORDER BY time ASC was missing because I didn't see it.
2024-03-31 16:52:59 +02:00
7786a7f6da Only crosspost link if still rank 1 after 1 hour 2024-03-31 03:48:52 +02:00
ekzyis
1416550be9 First wait, then run 2024-03-18 08:26:04 +01:00
ekzyis
7bf9e6c53c Remove unnecessary goroutines 2024-03-18 08:22:51 +01:00
ekzyis
0285c5a53a Sync HN items every minute 2024-03-18 08:19:46 +01:00
ekzyis
a2633b1fdb Update to sn-goapi v0.3.3
This should also fix checksum mismatches for v0.3.1
2024-03-18 07:54:24 +01:00
ekzyis
fc0d9d0690 Fix plots not generated until after item is posted 2024-03-18 06:43:48 +00:00
ekzyis
b8188fbb2e Fix comment 2024-03-18 07:15:22 +01:00
ekzyis
d116ba00bd Fix no error handling for comments 2024-03-18 07:14:43 +01:00
ekzyis
a3933b48c6 Only run every 15 minutes
This should prevent spam fees.
2024-03-18 07:09:19 +01:00
ekzyis
638939edd6 Only post the oldest HN item per run 2024-03-18 07:07:13 +01:00
ekzyis
bf0387ffdb go mod tidy 2024-03-17 22:12:56 +00:00
ekzyis
00d3d759ac Add link to chart
* Added SQL file to export CSV
* Updated plot.py to close figures due to RuntimeWarning shown
* hnbot now includes link to chart in comment
* charts are generated and copied to files.ekzyis.com using a cronjob
2024-03-17 21:29:33 +00:00
ekzyis
c993978384 Fix checksum mismatch 2024-03-17 20:49:17 +00:00
ekzyis
bf95fba2dc Fix post error: 'title: must be at least 5 characters' 2024-03-17 21:26:59 +01:00
ekzyis
45d0376889 Add plot script 2024-03-14 04:09:38 +01:00
ekzyis
9d38176eab Save dupes to prevent retries 2024-03-14 03:00:43 +01:00
ekzyis
7e4744503f Store time series of HN data in SQLite3
This can be used in the future for better content curation and generating charts.

Currently, it still posts stories when they hit rank 1.
2024-03-13 13:41:04 +01:00
ekzyis
c7e368ed2e Fix slice out of bounds error
Fixes following error:

  panic: runtime error: slice bounds out of range [:80] with length 53
2023-08-31 09:56:53 +02:00
ekzyis
bd4c8fb4a9 Limit title to 80 chars 2023-08-30 16:16:53 +02:00
ekzyis
ef9b948f0e Upgrade to sn-goapi v0.3.1 2023-08-30 16:13:31 +02:00
ekzyis
e5688d8cc4 Fix invalid URL on Ask HN posts 2023-08-15 22:23:16 +02:00
ekzyis
88e2ff8b41 Add session keepAlive 2023-08-15 22:07:39 +02:00
ekzyis
7e892859c3 Better logging 2023-06-13 00:02:14 +02:00
ekzyis
1a02fe42c1 Use tech sub 2023-06-12 23:54:15 +02:00
ekzyis
c40d61a6a8 Stop sending dupes error to discord 2023-06-09 06:02:55 +02:00
ekzyis
819905740a Update README 2023-06-08 23:26:22 +02:00
ekzyis
e0b086c744 Upgrade to sn-goapi v0.1.1 2023-06-01 03:41:20 +02:00
ekzyis
9e057fca03 Use sn-goapi v0.1.0 2023-06-01 03:11:36 +02:00
ekzyis
aaa89408d1 Query hasNewNotes and forward 2023-06-01 01:58:08 +02:00
ekzyis
64799bfa10 Fix GraphQL error handling 2023-05-11 23:37:48 +02:00
ekzyis
6cb901728a Add new required field 'sub' 2023-05-11 22:57:52 +02:00
ekzyis
2a9b7c6737 Wait until next full hour 2023-04-27 00:55:39 +02:00
ekzyis
5388480f31 Overhaul logging and error handling 2023-04-25 12:01:49 +02:00
ekzyis
85fa5997dd Set static vars outside of init 2023-04-25 02:30:04 +02:00
ekzyis
e0866c8470 Fix duplicate user struct 2023-04-25 02:26:44 +02:00
ekzyis
12143f1296 Replace ItemID with int 2023-04-25 02:25:59 +02:00
ekzyis
26aa14c9a1 Merge branch '6-show-error-if-dupes-exist-and-add-option-to-override' into 'develop'
Resolve "Show error if dupes exist and add option to override"

Closes #6

See merge request ekzyis/hnbot!5
2023-04-25 00:25:22 +00:00
ekzyis
c2b6e77751 Skip dupes check on skip reaction 2023-04-25 02:22:27 +02:00
ekzyis
8eaaaeab3e Rename var DiscordClient to dg 2023-04-25 02:01:58 +02:00
ekzyis
b2b957e5c3 Replace webhook with discordgo 2023-04-25 02:00:48 +02:00
ekzyis
6010e47dde Show dupes in discord 2023-04-25 00:54:45 +02:00
ekzyis
75597bab8e Merge branch '2-post-curated-content' into 'develop'
Use discord to enable manual posting of HN links

Closes #2

See merge request ekzyis/hnbot!4
2023-04-19 23:19:42 +00:00
ekzyis
21872fe6ac Add discord bot to post manual HN links 2023-04-20 01:18:18 +02:00
ekzyis
bc121ce87b Use infinite loop 2023-04-20 00:20:46 +02:00
ekzyis
eac946690b go mod tidy 2023-04-20 00:20:32 +02:00
ekzyis
073924066b Merge branch '4-send-logs-to-discord-channel-for-monitoring-purposes' into 'develop'
Send SN posts as embeds to Discord

Closes #4

See merge request ekzyis/hnbot!3
2023-04-19 20:50:13 +00:00
ekzyis
adff033c6b Send SN posts as embeds to Discord 2023-04-19 22:48:24 +02:00
ekzyis
82a380de0e Remove unused HN code 2023-04-19 21:14:03 +02:00
12 changed files with 386 additions and 510 deletions

.env.template (deleted, 3 lines)

@@ -1,3 +0,0 @@
SN_AUTH_COOKIE=
SN_USERNAME=hn
HN_AUTH_COOKIE=

.gitignore (1 line changed)

@@ -3,3 +3,4 @@
# go executable
hnbot
hnbot.sqlite3

README.md

@@ -1,7 +1,11 @@
# hnbot
> Hello, I am a bot crossposting top posts from HN.
> Hello, I am a bot posting top stories from HN.
>
> I curate content to only post stuff which could be interesting for the SN community on a best-efforts basis.
> My original mission was to orange-pill HN by offering the OPs on HN to claim the sats their stories received here.
However, my comments were shadowbanned and ultimately not approved by dang, the site admin.
See this thread: [#164155](https://stacker.news/items/164155)
>
> If you are one of these OPs and want to claim your sats, reply to this bio and we will find a solution!
-- https://stacker.news/items/161788
-- https://stacker.news/hn

db/db.go (new file, 96 lines)

@@ -0,0 +1,96 @@
package db
import (
"database/sql"
"fmt"
"log"
"github.com/ekzyis/hnbot/hn"
_ "github.com/mattn/go-sqlite3"
)
var (
_db *sql.DB
)
func init() {
var err error
_db, err = sql.Open("sqlite3", "hnbot.sqlite3")
if err != nil {
log.Fatal(err)
}
migrate(_db)
}
func Query(query string, args ...interface{}) (*sql.Rows, error) {
return _db.Query(query, args...)
}
func migrate(db *sql.DB) {
if _, err := db.Exec(`
CREATE TABLE IF NOT EXISTS hn_items (
id INTEGER NOT NULL,
created_at TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
time TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
title TEXT NOT NULL,
url TEXT,
author TEXT NOT NULL,
ndescendants INTEGER NOT NULL,
score INTEGER NOT NULL,
rank INTEGER NOT NULL,
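-- one row per (id, created_at) snapshot so rank and score history is kept per item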
PRIMARY KEY (id, created_at)
);
`); err != nil {
err = fmt.Errorf("error during migration: %w", err)
log.Fatal(err)
}
if _, err := db.Exec(`
CREATE TABLE IF NOT EXISTS sn_items (
id INTEGER NOT NULL,
created_at TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
hn_id INTEGER NOT NULL REFERENCES hn_items(id),
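-- composite key: the same SN item may be recorded as the dupe of multiple HN reposts (see "Fix unique constraint hit")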
PRIMARY KEY (id, hn_id)
);
`); err != nil {
err = fmt.Errorf("error during migration: %w", err)
log.Fatal(err)
}
}
func ItemHasComment(parentId int) bool {
var count int
err := _db.QueryRow(`SELECT COUNT(1) FROM comments WHERE parent_id = ?`, parentId).Scan(&count)
if err != nil {
err = fmt.Errorf("error during item check: %w", err)
log.Fatal(err)
}
return count > 0
}
func SaveHnItems(story *[]hn.Item) error {
for i, s := range *story {
if err := SaveHnItem(&s, i+1); err != nil {
return err
}
}
return nil
}
func SaveHnItem(s *hn.Item, rank int) error {
if _, err := _db.Exec(`
INSERT INTO hn_items(id, time, title, url, author, ndescendants, score, rank)
VALUES (?, ?, ?, ?, ?, ?, ?, ?)`,
s.ID, s.Time, s.Title, s.Url, s.By, s.Descendants, s.Score, rank); err != nil {
err = fmt.Errorf("error during item insert: %w", err)
return err
}
return nil
}
func SaveSnItem(id int, hnId int) error {
if _, err := _db.Exec(`INSERT INTO sn_items(id, hn_id) VALUES (?, ?)`, id, hnId); err != nil {
err = fmt.Errorf("error during sn item insert: %w", err)
return err
}
return nil
}
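Commit cc5a05574a ("Fix unique constraint hit") mentions a manual migration to the composite primary key (id, hn_id) shown above, but the migration itself is not part of this diff. A minimal sketch of what such a one-off migration could look like, assuming SQLite's rename-and-copy approach and that the old table already had the same columns — everything below is illustrative, not repository code:

package main

import (
	"database/sql"
	"log"

	_ "github.com/mattn/go-sqlite3"
)

func main() {
	db, err := sql.Open("sqlite3", "hnbot.sqlite3")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	tx, err := db.Begin()
	if err != nil {
		log.Fatal(err)
	}
	// rebuild sn_items with PRIMARY KEY (id, hn_id) and copy the old rows over
	stmts := []string{
		`ALTER TABLE sn_items RENAME TO sn_items_old`,
		`CREATE TABLE sn_items (
			id INTEGER NOT NULL,
			created_at TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
			hn_id INTEGER NOT NULL REFERENCES hn_items(id),
			PRIMARY KEY (id, hn_id)
		)`,
		`INSERT INTO sn_items (id, created_at, hn_id)
			SELECT id, created_at, hn_id FROM sn_items_old`,
		`DROP TABLE sn_items_old`,
	}
	for _, stmt := range stmts {
		if _, err := tx.Exec(stmt); err != nil {
			tx.Rollback()
			log.Fatal(err)
		}
	}
	if err := tx.Commit(); err != nil {
		log.Fatal(err)
	}
}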

go.mod (15 lines changed)

@@ -1,13 +1,12 @@
module gitlab.com/ekzyis/hnbot
module github.com/ekzyis/hnbot
go 1.20
require (
github.com/davecgh/go-spew v1.1.1 // indirect
github.com/dustin/go-humanize v1.0.1 // indirect
github.com/joho/godotenv v1.5.1 // indirect
github.com/namsral/flag v1.7.4-pre // indirect
github.com/pmezard/go-difflib v1.0.0 // indirect
github.com/stretchr/testify v1.8.2 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect
github.com/dustin/go-humanize v1.0.1
github.com/ekzyis/snappy v0.8.0
github.com/joho/godotenv v1.5.1
github.com/mattn/go-sqlite3 v1.14.22
)
require gopkg.in/guregu/null.v4 v4.0.0 // indirect

go.sum (24 lines changed)

@@ -1,22 +1,10 @@
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/dustin/go-humanize v1.0.1 h1:GzkhY7T5VNhEkwH0PVJgjz+fX1rhBrR7pRT3mDkpeCY=
github.com/dustin/go-humanize v1.0.1/go.mod h1:Mu1zIs6XwVuF/gI1OepvI0qD18qycQx+mFykh5fBlto=
github.com/ekzyis/snappy v0.8.0 h1:e7dRR384XJgNYa1FWNIZmqITSHOSanteBFXQJPfcQwg=
github.com/ekzyis/snappy v0.8.0/go.mod h1:UksYI0dU0+cnzz0LQjWB1P0QQP/ghx47e4atP99a5Lk=
github.com/joho/godotenv v1.5.1 h1:7eLL/+HRGLY0ldzfGMeQkb7vMd0as4CfYvUVzLqw0N0=
github.com/joho/godotenv v1.5.1/go.mod h1:f4LDr5Voq0i2e/R5DDNOoa2zzDfwtkZa6DnEwAbqwq4=
github.com/namsral/flag v1.7.4-pre h1:b2ScHhoCUkbsq0d2C15Mv+VU8bl8hAXV8arnWiOHNZs=
github.com/namsral/flag v1.7.4-pre/go.mod h1:OXldTctbM6SWH1K899kPZcf65KxJiD7MsceFUpB5yDo=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/objx v0.4.0/go.mod h1:YvHI0jy2hoMjB+UWwv71VJQ9isScKT/TqJzVSSt89Yw=
github.com/stretchr/objx v0.5.0/go.mod h1:Yh+to48EsGEfYuaHDzXPcE3xhTkx73EhmCGUpEOglKo=
github.com/stretchr/testify v1.7.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.8.0/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU=
github.com/stretchr/testify v1.8.2 h1:+h33VjcLVPDHtOdpUCuF+7gSuG3yGIftsP1YvFihtJ8=
github.com/stretchr/testify v1.8.2/go.mod h1:w2LPCIKwWwSfY2zedu0+kehJoqGctiVI29o6fzry7u4=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
github.com/mattn/go-sqlite3 v1.14.22 h1:2gZY6PC6kBnID23Tichd1K+Z0oS6nE/XwU+Vz/5o4kU=
github.com/mattn/go-sqlite3 v1.14.22/go.mod h1:Uh1q+B4BYcTPb+yiD3kU8Ct7aC0hY9fxUwlHK0RXw+Y=
gopkg.in/guregu/null.v4 v4.0.0 h1:1Wm3S1WEA2I26Kq+6vcW+w0gcDo44YKYD7YIEJNHDjg=
gopkg.in/guregu/null.v4 v4.0.0/go.mod h1:YoQhUrADuG3i9WqesrCmpNRwm1ypAgSHYqoOcTu/JrI=

hn.go (deleted, 174 lines)

@@ -1,174 +0,0 @@
package main
import (
"encoding/json"
"fmt"
"io/ioutil"
"log"
"net/http"
"net/url"
"regexp"
"strconv"
"strings"
"github.com/joho/godotenv"
"github.com/namsral/flag"
)
type ItemID = int
type Story struct {
ID ItemID
By string // username of author
Time int // UNIX timestamp
Descendants int // number of comments
Kids []ItemID
Score int
Title string
Url string
}
var (
HackerNewsUrl string
HackerNewsFirebaseUrl string
HnAuthCookie string
)
func init() {
HackerNewsUrl = "https://news.ycombinator.com"
HackerNewsFirebaseUrl = "https://hacker-news.firebaseio.com/v0"
err := godotenv.Load()
if err != nil {
log.Fatal("Error loading .env file")
}
flag.StringVar(&HnAuthCookie, "HN_AUTH_COOKIE", "", "Cookie required for authorizing requests to news.ycombinator.com")
flag.Parse()
if HnAuthCookie == "" {
log.Fatal("HN_AUTH_COOKIE not set")
}
}
func FetchHackerNewsTopStories() []Story {
// API docs: https://github.com/HackerNews/API
url := fmt.Sprintf("%s/topstories.json", HackerNewsFirebaseUrl)
resp, err := http.Get(url)
if err != nil {
log.Fatal("Error fetching top stories:", err)
}
defer resp.Body.Close()
log.Printf("GET %s %d\n", url, resp.StatusCode)
var ids []int
err = json.NewDecoder(resp.Body).Decode(&ids)
if err != nil {
log.Fatal("Error decoding top stories JSON:", err)
}
// we are only interested in the first page of top stories
const limit = 30
ids = ids[:limit]
var stories [limit]Story
for i, id := range ids {
story := FetchStoryById(id)
stories[i] = story
}
// Can't return [30]Story as []Story so we copy the array
return stories[:]
}
func FetchStoryById(id ItemID) Story {
url := fmt.Sprintf("https://hacker-news.firebaseio.com/v0/item/%d.json", id)
resp, err := http.Get(url)
if err != nil {
log.Fatal("Error fetching story:", err)
}
defer resp.Body.Close()
log.Printf("GET %s %d\n", url, resp.StatusCode)
var story Story
err = json.NewDecoder(resp.Body).Decode(&story)
if err != nil {
log.Fatal("Error decoding story JSON:", err)
}
return story
}
func FetchHackerNewsItemHMAC(id ItemID) string {
hnUrl := fmt.Sprintf("%s/item?id=%d", HackerNewsUrl, id)
req, err := http.NewRequest("GET", hnUrl, nil)
if err != nil {
panic(err)
}
// Cookie header must be set to fetch the correct HMAC for posting comments
req.Header.Set("Cookie", HnAuthCookie)
client := http.DefaultClient
resp, err := client.Do(req)
if err != nil {
panic(err)
}
log.Printf("GET %s %d\n", hnUrl, resp.StatusCode)
body, err := ioutil.ReadAll(resp.Body)
if err != nil {
log.Fatal("Failed to read response body:", err)
}
// Find HMAC in body
re := regexp.MustCompile(`name="hmac" value="([a-z0-9]+)"`)
match := re.FindStringSubmatch(string(body))
if len(match) == 0 {
log.Fatal("No HMAC found")
}
hmac := match[1]
return hmac
}
func CommentHackerNewsStory(text string, id ItemID) {
hmac := FetchHackerNewsItemHMAC(id)
hnUrl := fmt.Sprintf("%s/comment", HackerNewsUrl)
data := url.Values{}
data.Set("parent", strconv.Itoa(id))
data.Set("goto", fmt.Sprintf("item?id=%d", id))
data.Set("text", text)
data.Set("hmac", hmac)
req, err := http.NewRequest("POST", hnUrl, strings.NewReader(data.Encode()))
if err != nil {
panic(err)
}
req.Header.Set("Content-Type", "application/x-www-form-urlencoded")
req.Header.Set("Cookie", HnAuthCookie)
client := http.DefaultClient
resp, err := client.Do(req)
if err != nil {
panic(err)
}
defer resp.Body.Close()
log.Printf("POST %s %d\n", hnUrl, resp.StatusCode)
}
func HackerNewsUserLink(user string) string {
return fmt.Sprintf("%s/user?id=%s", HackerNewsUrl, user)
}
func HackerNewsItemLink(id int) string {
return fmt.Sprintf("%s/item?id=%d", HackerNewsUrl, id)
}
func FindHackerNewsItemId(text string) int {
re := regexp.MustCompile(fmt.Sprintf(`\[HN\]\(%s/item\?id=([0-9]+)\)`, HackerNewsUrl))
match := re.FindStringSubmatch(text)
if len(match) == 0 {
log.Fatal("No Hacker News item URL found")
}
id, err := strconv.Atoi(match[1])
if err != nil {
panic(err)
}
return id
}

hn/hn.go (new file, 105 lines)

@@ -0,0 +1,105 @@
package hn
import (
"encoding/json"
"errors"
"fmt"
"log"
"net/http"
"regexp"
"strconv"
)
type Item struct {
ID int
By string // username of author
Time int // UNIX timestamp
Descendants int // number of comments
Kids []int
Score int
Title string
Url string
}
var (
hnUrl = "https://news.ycombinator.com"
hnFirebaseUrl = "https://hacker-news.firebaseio.com/v0"
hnLinkRegexp = regexp.MustCompile(`(?:https?:\/\/)?news\.ycombinator\.com\/item\?id=([0-9]+)`)
)
func FetchTopItems() ([]Item, error) {
log.Println("[hn] fetch top items ...")
// API docs: https://github.com/HackerNews/API
url := fmt.Sprintf("%s/topstories.json", hnFirebaseUrl)
resp, err := http.Get(url)
if err != nil {
return nil, fmt.Errorf("error fetching HN top stories: %w", err)
}
defer resp.Body.Close()
var ids []int
err = json.NewDecoder(resp.Body).Decode(&ids)
if err != nil {
return nil, fmt.Errorf("error decoding HN top stories JSON: %w", err)
}
// we are only interested in the first page of top stories
const limit = 30
ids = ids[:limit]
var stories [limit]Item
for i, id := range ids {
var item Item
err := FetchItemById(id, &item)
if err != nil {
return nil, err
}
stories[i] = item
}
log.Println("[hn] fetch top items ... OK")
// Can't return [30]Item as []Item so we copy the array
return stories[:], nil
}
func FetchItemById(id int, hnItem *Item) error {
// log.Printf("[hn] fetch HN item %d ...\n", id)
url := fmt.Sprintf("https://hacker-news.firebaseio.com/v0/item/%d.json", id)
resp, err := http.Get(url)
if err != nil {
err = fmt.Errorf("error fetching HN item %d: %w", id, err)
return err
}
defer resp.Body.Close()
err = json.NewDecoder(resp.Body).Decode(&hnItem)
if err != nil {
err := fmt.Errorf("error decoding JSON for HN item %d: %w", id, err)
return err
}
// log.Printf("[hn] fetch HN item %d ... OK\n", id)
return nil
}
func ParseLink(link string) (int, error) {
match := hnLinkRegexp.FindStringSubmatch(link)
if len(match) == 0 {
return -1, errors.New("not a hacker news link")
}
id, err := strconv.Atoi(match[1])
if err != nil {
return -1, errors.New("string to integer conversion failed")
}
return id, nil
}
func UserLink(user string) string {
return fmt.Sprintf("%s/user?id=%s", hnUrl, user)
}
func ItemLink(id int) string {
return fmt.Sprintf("%s/item?id=%d", hnUrl, id)
}
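Not from the repository: a minimal usage sketch of this package, using only the functions defined above (the item id in the link is arbitrary):

package main

import (
	"fmt"
	"log"

	"github.com/ekzyis/hnbot/hn"
)

func main() {
	// resolve an HN link to its item id, then fetch the item from the Firebase API
	id, err := hn.ParseLink("https://news.ycombinator.com/item?id=1")
	if err != nil {
		log.Fatal(err)
	}
	var item hn.Item
	if err := hn.FetchItemById(id, &item); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%q by %s: %s\n", item.Title, item.By, hn.ItemLink(item.ID))
}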

main.go (79 lines changed)

@@ -1,32 +1,77 @@
package main
import (
"errors"
"log"
"time"
"github.com/ekzyis/hnbot/db"
"github.com/ekzyis/hnbot/hn"
sn "github.com/ekzyis/hnbot/sn"
"github.com/joho/godotenv"
"github.com/namsral/flag"
)
var (
SnUserName string
)
func SyncHnItemsToDb() {
for {
now := time.Now()
dur := now.Truncate(time.Minute).Add(time.Minute).Sub(now)
log.Println("[hn] sleeping for", dur.Round(time.Second))
time.Sleep(dur)
func init() {
err := godotenv.Load()
if err != nil {
log.Fatal("Error loading .env file")
}
flag.StringVar(&SnUserName, "SN_USERNAME", "", "Username of bot on SN")
flag.Parse()
if SnUserName == "" {
log.Fatal("SN_USERNAME not set")
stories, err := hn.FetchTopItems()
if err != nil {
log.Println(err)
continue
}
if err := db.SaveHnItems(&stories); err != nil {
log.Println(err)
continue
}
}
}
func main() {
stories := FetchHackerNewsTopStories()
filtered := CurateContentForStackerNews(&stories)
for _, story := range *filtered {
PostStoryToStackerNews(&story)
if err := godotenv.Load(); err != nil {
log.Fatal(err)
}
// fetch HN front page every minute in the background and store state in db
go SyncHnItemsToDb()
// check every 15 minutes if there is now a HN item that is worth posting to SN
for {
var (
filtered *[]hn.Item
err error
)
now := time.Now()
dur := now.Truncate(time.Minute).Add(15 * time.Minute).Sub(now)
log.Println("[sn] sleeping for", dur.Round(time.Second))
time.Sleep(dur)
if filtered, err = sn.CurateContent(); err != nil {
log.Println(err)
continue
}
log.Printf("[sn] found %d item(s) to post\n", len(*filtered))
for _, item := range *filtered {
_, err := sn.Post(&item, sn.PostOptions{SkipDupes: false})
if err != nil {
var dupesErr *sn.DupesError
if errors.As(err, &dupesErr) {
log.Println(dupesErr)
parentId := dupesErr.Dupes[0].Id
if err := db.SaveSnItem(parentId, item.ID); err != nil {
log.Println(err)
}
continue
}
log.Println(err)
continue
}
}
}
}

sn.go (deleted, 273 lines)

@@ -1,273 +0,0 @@
package main
import (
"bytes"
"encoding/json"
"fmt"
"log"
"net/http"
"time"
"github.com/dustin/go-humanize"
"github.com/joho/godotenv"
"github.com/namsral/flag"
)
type GraphQLPayload struct {
Query string `json:"query"`
Variables map[string]interface{} `json:"variables,omitempty"`
}
type Dupe struct {
Id int `json:"id,string"`
Url string `json:"url"`
Title string `json:"title"`
}
type DupesResponse struct {
Data struct {
Dupes []Dupe `json:"dupes"`
} `json:"data"`
}
type User struct {
Name string `json:"name"`
}
type Comment struct {
Id int `json:"id,string"`
Text string `json:"text"`
User User `json:"user"`
Comments []Comment `json:"comments"`
}
type Item struct {
Id int `json:"id,string"`
Title string `json:"title"`
Url string `json:"url"`
Sats int `json:"sats"`
CreatedAt time.Time `json:"createdAt"`
Comments []Comment `json:"comments"`
NComments int `json:"ncomments"`
}
type UpsertLinkResponse struct {
Data struct {
UpsertLink Item `json:"upsertLink"`
} `json:"data"`
}
type ItemsResponse struct {
Data struct {
Items struct {
Items []Item `json:"items"`
Cursor string `json:"cursor"`
} `json:"items"`
} `json:"data"`
}
var (
StackerNewsUrl string
SnApiUrl string
SnAuthCookie string
)
func init() {
StackerNewsUrl = "https://stacker.news"
SnApiUrl = "https://stacker.news/api/graphql"
err := godotenv.Load()
if err != nil {
log.Fatal("Error loading .env file")
}
flag.StringVar(&SnAuthCookie, "SN_AUTH_COOKIE", "", "Cookie required for authorizing requests to stacker.news/api/graphql")
flag.Parse()
if SnAuthCookie == "" {
log.Fatal("SN_AUTH_COOKIE not set")
}
}
func MakeStackerNewsRequest(body GraphQLPayload) *http.Response {
bodyJSON, err := json.Marshal(body)
if err != nil {
log.Fatal("Error during json.Marshal:", err)
}
req, err := http.NewRequest("POST", SnApiUrl, bytes.NewBuffer(bodyJSON))
if err != nil {
panic(err)
}
req.Header.Set("Content-Type", "application/json")
req.Header.Set("Cookie", SnAuthCookie)
client := http.DefaultClient
resp, err := client.Do(req)
if err != nil {
panic(err)
}
log.Printf("POST %s %d\n", SnApiUrl, resp.StatusCode)
return resp
}
func CurateContentForStackerNews(stories *[]Story) *[]Story {
// TODO: filter by relevance
slice := (*stories)[0:1]
return &slice
}
func FetchStackerNewsDupes(url string) *[]Dupe {
body := GraphQLPayload{
Query: `
query Dupes($url: String!) {
dupes(url: $url) {
id
url
title
}
}`,
Variables: map[string]interface{}{
"url": url,
},
}
resp := MakeStackerNewsRequest(body)
defer resp.Body.Close()
var dupesResp DupesResponse
err := json.NewDecoder(resp.Body).Decode(&dupesResp)
if err != nil {
log.Fatal("Error decoding dupes JSON:", err)
}
return &dupesResp.Data.Dupes
}
func PostStoryToStackerNews(story *Story) {
dupes := FetchStackerNewsDupes(story.Url)
if len(*dupes) > 0 {
log.Printf("%s was already posted. Skipping.\n", story.Url)
return
}
body := GraphQLPayload{
Query: `
mutation upsertLink($url: String!, $title: String!) {
upsertLink(url: $url, title: $title) {
id
}
}`,
Variables: map[string]interface{}{
"url": story.Url,
"title": story.Title,
},
}
resp := MakeStackerNewsRequest(body)
defer resp.Body.Close()
var upsertLinkResp UpsertLinkResponse
err := json.NewDecoder(resp.Body).Decode(&upsertLinkResp)
if err != nil {
log.Fatal("Error decoding dupes JSON:", err)
}
parentId := upsertLinkResp.Data.UpsertLink.Id
log.Println("Created new post on SN")
log.Printf("id=%d title='%s' url=%s\n", parentId, story.Title, story.Url)
comment := fmt.Sprintf(
"This link was posted by [%s](%s) %s on [HN](%s). It received %d points and %d comments.",
story.By,
HackerNewsUserLink(story.By),
humanize.Time(time.Unix(int64(story.Time), 0)),
HackerNewsItemLink(story.ID),
story.Score, story.Descendants,
)
CommentStackerNewsPost(comment, parentId)
}
func CommentStackerNewsPost(text string, parentId int) {
body := GraphQLPayload{
Query: `
mutation createComment($text: String!, $parentId: ID!) {
createComment(text: $text, parentId: $parentId) {
id
}
}`,
Variables: map[string]interface{}{
"text": text,
"parentId": parentId,
},
}
resp := MakeStackerNewsRequest(body)
defer resp.Body.Close()
log.Println("Commented post on SN")
log.Printf("text='%s' parentId=%d\n", text, parentId)
}
func FetchStackerNewsUserItems(user string) *[]Item {
query := `
query items($name: String!, $cursor: String) {
items(name: $name, sort: "user", cursor: $cursor) {
items {
id
title
url
sats
createdAt
comments {
id
text
user {
name
}
comments {
id
text
user {
name
}
}
}
ncomments
}
cursor
}
}
`
var items []Item
var cursor string
for {
body := GraphQLPayload{
Query: query,
Variables: map[string]interface{}{
"name": user,
"cursor": cursor,
},
}
resp := MakeStackerNewsRequest(body)
defer resp.Body.Close()
var itemsResp ItemsResponse
err := json.NewDecoder(resp.Body).Decode(&itemsResp)
if err != nil {
log.Fatal("Error decoding items JSON:", err)
}
fetchedItems := itemsResp.Data.Items.Items
for _, item := range fetchedItems {
items = append(items, item)
}
if len(fetchedItems) < 21 {
break
}
cursor = itemsResp.Data.Items.Cursor
}
log.Printf("Fetched %d items\n", len(items))
return &items
}
func StackerNewsItemLink(id int) string {
return fmt.Sprintf("%s/items/%d", StackerNewsUrl, id)
}

sn/sn.go (new file, 102 lines)

@@ -0,0 +1,102 @@
package sn
import (
"database/sql"
"fmt"
"log"
"time"
"github.com/dustin/go-humanize"
"github.com/ekzyis/hnbot/db"
"github.com/ekzyis/hnbot/hn"
sn "github.com/ekzyis/snappy"
)
type DupesError = sn.DupesError
func CurateContent() (*[]hn.Item, error) {
var (
rows *sql.Rows
err error
)
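// pick the oldest HN item that was seen at rank 1 over a span of at least an hour,
// has not been posted to SN yet and has a title long enough for SN (>= 5 chars),
// using its latest snapshot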
if rows, err = db.Query(`
SELECT t.id, time, title, url, author, score, ndescendants
FROM (
SELECT id, MIN(created_at) AS start, MAX(created_at) AS end
FROM hn_items
WHERE rank = 1 AND id NOT IN (SELECT hn_id FROM sn_items) AND length(title) >= 5
GROUP BY id
HAVING unixepoch(end) - unixepoch(start) >= 3600
ORDER BY time ASC
LIMIT 1
) t JOIN hn_items ON t.id = hn_items.id AND t.end = hn_items.created_at;
`); err != nil {
err = fmt.Errorf("error querying hn_items: %w", err)
return nil, err
}
defer rows.Close()
var items []hn.Item
for rows.Next() {
var item hn.Item
if err = rows.Scan(&item.ID, &item.Time, &item.Title, &item.Url, &item.By, &item.Score, &item.Descendants); err != nil {
err = fmt.Errorf("error scanning hn_items: %w", err)
return nil, err
}
items = append(items, item)
}
if err = rows.Err(); err != nil {
err = fmt.Errorf("error iterating hn_items: %w", err)
return nil, err
}
return &items, nil
}
type PostOptions struct {
SkipDupes bool
}
func Post(item *hn.Item, options PostOptions) (int, error) {
c := sn.NewClient()
url := item.Url
if url == "" {
url = hn.ItemLink(item.ID)
}
log.Printf("post to SN: %s ...\n", url)
if !options.SkipDupes {
dupes, err := c.Dupes(url)
if err != nil {
return -1, err
}
if len(*dupes) > 0 {
return -1, &sn.DupesError{Url: url, Dupes: *dupes}
}
}
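// truncate overly long titles to 80 characters (see "Limit title to 80 chars")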
title := item.Title
if len(title) > 80 {
title = title[0:80]
}
comment := fmt.Sprintf(
"This link was posted by [%s](%s) %s on [HN](%s). It received %d points and %d comments.",
item.By,
hn.UserLink(item.By),
humanize.Time(time.Unix(int64(item.Time), 0)),
hn.ItemLink(item.ID),
item.Score, item.Descendants,
)
parentId, err := c.PostLink(url, title, comment, "tech")
if err != nil {
return -1, fmt.Errorf("error posting link: %w", err)
}
log.Printf("post to SN: %s ... OK \n", url)
if err := db.SaveSnItem(parentId, item.ID); err != nil {
return -1, err
}
return parentId, nil
}

(deleted Go test file, 14 lines)

@@ -1,14 +0,0 @@
package main
import (
"testing"
"github.com/stretchr/testify/assert"
)
func TestFetchDupes(t *testing.T) {
// TODO: mock HTTP request
url := "https://en.wikipedia.org/wiki/Dishwasher_salmon"
dupes := FetchStackerNewsDupes(url)
assert.NotEmpty(t, *dupes, "Expected at least one duplicate")
}