Compare commits

..

113 Commits

Author SHA1 Message Date
ekzyis
1b6de0bb96 Return null if no wallet was found 2024-07-19 08:20:00 -05:00
keyan
a0c1d4f602 make lnc work 2024-07-18 18:56:49 -05:00
ekzyis
5d03e08514 Remove validate.schema as a trap door 2024-07-17 03:32:00 +02:00
ekzyis
6a5713034b Make clear that message belongs to test
* validate.message was used in tandem with validate.test
* it might be confused as the message if the validation for validate.type failed
* now validate.test can be a function or an object of { test, message } shape which matches Yup.test
2024-07-17 03:31:16 +02:00
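A minimal sketch of the two accepted shapes described above, assuming a hypothetical `url` field (the field itself is illustrative; only the `validate.test` shapes come from the commit message):

```js
// shape 1: validate.test as a bare function
const urlFieldFn = {
  name: 'url',
  validate: {
    type: 'string',
    test: value => value.startsWith('https://')
  }
}

// shape 2: validate.test as a { test, message } object, matching Yup.test
const urlFieldObj = {
  name: 'url',
  validate: {
    type: 'string',
    test: {
      test: value => value.startsWith('https://'),
      message: 'must use https'
    }
  }
}
```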
ekzyis
c8d91bf42d Generate validation schema for CLN 2024-07-17 03:31:16 +02:00
ekzyis
08a5ce1a28 Remove stringTypes 2024-07-17 03:31:16 +02:00
ekzyis
4df0b460c3 Generate validation schema for LnAddr 2024-07-17 02:48:46 +02:00
ekzyis
587bfa34be Generate validation schema for LND 2024-07-17 02:38:04 +02:00
ekzyis
3933a4f460 Generate validation schema for LNC 2024-07-17 01:25:53 +02:00
ekzyis
667cde6042 Rename to torAllowed 2024-07-17 01:03:45 +02:00
ekzyis
6432ea7b44 Generate validation schema for NWC 2024-07-17 00:58:43 +02:00
ekzyis
fb2b34ce67 Generate validation schema for LNbits 2024-07-17 00:58:43 +02:00
ekzyis
9587ff9a52 Fix autowithdrawal error log 2024-07-16 22:55:04 +02:00
ekzyis
538f1e21d6 Fix id access in walletPrioritySort 2024-07-16 22:46:15 +02:00
ekzyis
e25a3dbec0 Fix w.default usage
Since package.json with { "type": "module" } was added, this is no longer needed.
2024-07-16 22:39:24 +02:00
Keyan
128f1f93b8
Merge branch 'master' into wallet-interface 2024-07-16 15:24:02 -05:00
ekzyis
b777fdcddc Fix wallet.server usage
* I removed wallet.server in a previous commit
* the client couldn't determine which wallet was stored on the server since all server-specific fields were set in server.js
* walletType and walletField are now set in index.js
* walletType is now used to determine if a wallet is stored on the server

* also included some formatting changes
2024-07-16 22:08:41 +02:00
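A one-line sketch of the check the commit message describes (grounded only in the message above; the helper name is illustrative):

```js
// a wallet is stored on the server iff its definition declares a walletType
const isServerWallet = wallet => wallet.walletType !== undefined
```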
ekzyis
bbcfc2fada Fix worker import of wallets/server 2024-07-16 17:44:21 +02:00
ekzyis
5b2e835722 Separate client and server imports by files
* wallets now consist of an index.js, a client.js and a server.js file
* client.js is imported on the client and contains the client portion
* server.js is imported on the server and contains the server portion
* both reexport index.js so everything in index.js can be shared by client and server

* every wallet contains a client.js file since they are all imported on the client to show the cards

* client.js of every wallet is reexported as an array in wallets/client.js
* server.js of every wallet is reexported as an array in wallets/server.js

FIXME: for some reason, worker does not properly import the default export of wallets/server.js
2024-07-16 15:46:44 +02:00
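A rough sketch of the layout this commit message describes; the wallet name and function bodies are placeholders, not taken from the repo:

```js
// wallets/lnbits/index.js: shared by client and server
export const name = 'lnbits'
export const fields = [/* form field definitions */]

// wallets/lnbits/client.js: client portion, reexports index.js
export * from './index'
export async function sendPayment (bolt11, config) { /* pay from the browser */ }

// wallets/lnbits/server.js: server portion, reexports index.js
export * from './index'
export async function createInvoice (amount, config) { /* receive on the server */ }

// wallets/client.js: every wallet's client.js reexported as an array
import * as lnbits from '@/wallets/lnbits/client'
export default [lnbits]
```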
ekzyis
259ebef971 Fix generateMutation
* remove resolverName property from wallet defs
* move function into lib/wallet
* use function in generateMutation on client to fix wrongly generated mutation
2024-07-16 14:18:57 +02:00
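A hedged sketch of what generating the mutation from a wallet definition might look like; the naming scheme and field list are assumptions for illustration, not the repo's actual implementation:

```js
// derive the resolver name from walletField and build the mutation from the fields
function resolverName (walletField) {
  // e.g. 'walletLNbits' -> 'upsertWalletLNbits'
  return 'upsert' + walletField[0].toUpperCase() + walletField.slice(1)
}

function generateMutation (wallet) {
  const name = resolverName(wallet.walletField)
  const args = wallet.fields.map(f => `$${f.name}: String`).join(', ')
  const vars = wallet.fields.map(f => `${f.name}: $${f.name}`).join(', ')
  return `mutation ${name}(${args}) { ${name}(${vars}) }`
}
```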
ekzyis
7851366cd5 Put wallets into own folder 2024-07-16 07:54:27 +02:00
ekzyis
cba76444dd Move wallets into top level directory wallet/ 2024-07-16 06:09:27 +02:00
ekzyis
f01ce79afa Generate resolver name from walletField 2024-07-16 04:08:13 +02:00
ekzyis
03ca84629b Remove React dependency from wallet definitions 2024-07-15 16:23:24 +02:00
ekzyis
7749c14d3b Remove 'tor or clearnet' hint for LN addresses 2024-07-15 14:46:48 +02:00
ekzyis
ee1574cf45 Fix leaking relay connections 2024-07-15 13:56:21 +02:00
ekzyis
6ac675429c Merge branch 'master' into wallet-interface 2024-07-15 13:24:38 +02:00
keyan
c767e106a0 merge master 2024-07-12 18:24:31 -05:00
ekzyis
6e6af40eb9 Toast priority save errors 2024-07-08 13:20:03 +02:00
ekzyis
05c0f8a66e Remove console.log 2024-07-08 13:14:30 +02:00
ekzyis
80756f23a4 Remove TODOs
TODO in components/wallet-logger.js was handled.
I don't see a need for the TODO in lib/wallet.js anymore. This function will only be called with the wallet of type LIGHTNING_ADDRESS anyway.
2024-07-08 13:04:03 +02:00
ekzyis
24bdf0a099 Add example wallet def 2024-07-08 12:58:58 +02:00
ekzyis
d9205b6d30 Add link to lnbits.com 2024-07-08 12:56:43 +02:00
ekzyis
7402885998 Use common sort 2024-07-08 11:34:05 +02:00
ekzyis
1a60f13d72 Fix order if wallet with no priority exists 2024-07-08 11:06:46 +02:00
ekzyis
920478a72c Update LNC code
* remove LNC FIXMEs

Mhh, I guess the TURN server was down or something? It now magically works. Or maybe it only works once per mnemonic?

* also removed the lnc.lnd.lightning.getInfo() call since we don't ask for, and don't need, permission for this RPC for payments.

* setting a password does not work though. It fails with 'The password provided is not valid' which is triggered at https://github.com/lightninglabs/lnc-web/blob/main/lib/util/credentialStore.ts#L81.
2024-07-08 10:59:04 +02:00
ekzyis
9af8e63355 Fix error per invalid bip39 word 2024-07-08 08:26:51 +02:00
ekzyis
8a36bffb85 Fix autowithdraw priority order 2024-07-08 08:07:14 +02:00
ekzyis
8ea4d0c8a7 Fix duplicate CLN error 2024-07-08 07:59:28 +02:00
ekzyis
2051dd0e88 Use touches instead of dnd on mobile
Browsers don't support drag events for touch devices.

To have a consistent implementation for desktop and mobile, we would need to use mousedown/touchstart, mouseup/touchend and mousemove/touchmove.

For now, this commit makes changing the order possible on touch devices with simple touches.
2024-07-08 07:33:10 +02:00
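A minimal sketch of reordering with plain touches instead of drag events (names and state handling are illustrative):

```js
// first touch selects an item, the next touch swaps it with the touched position
let selected = null

function onItemTouch (index, items, setItems) {
  if (selected === null) {
    selected = index
    return
  }
  const next = [...items]
  ;[next[selected], next[index]] = [next[index], next[selected]]
  selected = null
  setItems(next)
}
```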
ekzyis
5d678ced23 Fix draggable false on first page load due to SSR 2024-07-08 06:54:27 +02:00
ekzyis
459478036f Fix priority ignored when fetching enabled wallet 2024-07-08 05:49:54 +02:00
ekzyis
a69bca0f05 Use inject function for resolvers and typeDefs 2024-07-07 20:04:33 +02:00
ekzyis
85cfda330b Remove Wallet in lib/constants 2024-07-07 18:35:57 +02:00
ekzyis
85464f93b9 Detach wallets and delete logs on logout 2024-07-07 18:35:57 +02:00
ekzyis
dddbb53792 Add CLN autowithdrawal 2024-07-07 18:35:57 +02:00
ekzyis
ebe741dc92 Add missing hints 2024-07-07 18:35:57 +02:00
ekzyis
6bee659f2f Fix autowithdraw loop 2024-07-07 18:35:57 +02:00
ekzyis
bd0e4d906c Fix draggable 2024-07-07 18:35:57 +02:00
ekzyis
7528e5c2b6 Add optional wallet short name for logging 2024-07-07 18:35:57 +02:00
ekzyis
1ce09051b1 Add autowithdrawal to lightning address 2024-07-07 18:35:56 +02:00
ekzyis
8dac53d7d5 Fix wallet security banner shown for server wallets 2024-07-07 18:31:41 +02:00
ekzyis
cd074a47b7 Fix success autowithdrawal log 2024-07-07 18:31:41 +02:00
ekzyis
12bedae01a Use wallet.createInvoice for autowithdrawals 2024-07-07 18:31:41 +02:00
ekzyis
b569c8faa0 Fix import inconsistency between app and worker 2024-07-07 18:31:41 +02:00
ekzyis
ba00c3d9fa Generate wallet resolver from fields 2024-07-07 18:31:41 +02:00
ekzyis
00f78daadc Generate wallet mutation from fields 2024-07-07 18:31:41 +02:00
ekzyis
0a0085fe82 Remove unnecessary WALLETS_QUERY 2024-07-07 18:31:41 +02:00
ekzyis
48ead97615 Run lnbits url.replace in validate and sendPayment 2024-07-07 18:31:41 +02:00
ekzyis
6463e6eec8 Split arguments into [value,] config, context 2024-07-07 18:31:41 +02:00
ekzyis
0ebe097a70 Fix noisy changes in lib/validate
I moved the schema for lnbits, nwc and lnc out of lib/validate only to put them back in there later.

This commit should make the changeset cleaner by removing noise.
2024-07-07 18:31:41 +02:00
ekzyis
850c534c91 Fix typo 2024-07-07 18:31:41 +02:00
ekzyis
83fd39b035 Fix onCanceled missing 2024-07-07 18:31:41 +02:00
ekzyis
9bbf2056e9 Save dedicated enabled flag for server wallets
* wallet table now contains boolean column 'enabled'
* 'priority' is now a number everywhere
* use consistent order between how autowithdrawals are attempted and server wallets cards
2024-07-07 18:31:41 +02:00
ekzyis
8acf74c787 Fix autowithdrawSettings not applied
Form requires config in flat format but mutation requires autowithdraw settings in a separate 'settings' field.

I have decided that config will be in flat form format. It will be transformed into mutation format during save.
2024-07-07 18:31:41 +02:00
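A small sketch of the transform described above; the specific setting names are assumptions for illustration:

```js
// split flat form values into wallet config and autowithdraw settings before mutating
const AUTOWITHDRAW_SETTINGS = ['autoWithdrawThreshold', 'autoWithdrawMaxFeePercent']

function toMutationVariables (formValues) {
  const settings = {}
  const config = {}
  for (const [key, value] of Object.entries(formValues)) {
    if (AUTOWITHDRAW_SETTINGS.includes(key)) settings[key] = value
    else config[key] = value
  }
  return { ...config, settings }
}
```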
ekzyis
55928ac252 Save order as priority 2024-07-07 18:31:41 +02:00
ekzyis
c270805649 Use dynamic import for WalletCard
This fixes a lot of issues with hydration
2024-07-07 18:31:41 +02:00
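For context, a minimal sketch of a client-only dynamic import in Next.js; the component path is illustrative:

```js
import dynamic from 'next/dynamic'

// render the card only on the client so server and client markup cannot diverge
const WalletCard = dynamic(() => import('@/components/wallet-card'), { ssr: false })
```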
ekzyis
eb2f4b980f Implement drag & drop w/o persistence 2024-07-07 18:31:41 +02:00
ekzyis
b96757b366 Move all validation schema into lib/validate 2024-07-07 18:31:41 +02:00
ekzyis
39d8928772 Disable checkbox if not configured yet 2024-07-07 18:31:41 +02:00
ekzyis
da6d262e0a Also enable server wallets on create 2024-07-07 18:31:41 +02:00
ekzyis
d20e258649 Consistent logs between local and server wallets
* 'wallet attached' on create
* 'wallet updated' on config updates
* 'wallet enabled' and 'wallet disabled' if checkbox changed
* 'wallet detached' on delete
2024-07-07 18:31:41 +02:00
ekzyis
d60e26bfdf Fix wallet logs not updated after server delete 2024-07-07 18:31:41 +02:00
ekzyis
9509833b88 Also use 'enabled' for server wallets 2024-07-07 18:31:41 +02:00
ekzyis
645ff78365 Fix server config not updated after save or detach 2024-07-07 18:31:41 +02:00
ekzyis
c18263dc73 Fix another hydration error 2024-07-07 18:31:41 +02:00
ekzyis
d8e82ddea5 Only include local/server config if required 2024-07-07 18:31:41 +02:00
ekzyis
e091377d94 Fix TypeError in isConfigured if no enabled wallet found 2024-07-07 18:31:41 +02:00
ekzyis
5b561e22a9 Fix wallet logs refetch
onError does not exist on client.mutate
2024-07-07 18:31:41 +02:00
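Since the imperative `client.mutate` has no `onError` option (unlike the `useMutation` hook), errors have to be handled on the returned promise; a hedged sketch with illustrative document and query names:

```js
try {
  await client.mutate({
    mutation: DELETE_WALLET_LOGS, // illustrative mutation document
    refetchQueries: ['WalletLogs'] // illustrative query name for refetching the logs
  })
} catch (err) {
  console.error('deleting wallet logs failed', err)
}
```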
ekzyis
4bf9954c4e Fix delete wallet logs on server 2024-07-07 18:31:41 +02:00
ekzyis
3b0605a691 Fix isConfigured 2024-07-07 18:31:41 +02:00
ekzyis
1f98a1a891 Fix usage of conditional hooks in useConfig 2024-07-07 18:31:41 +02:00
ekzyis
377ac04c85 Use same error format in toast and wallet log 2024-07-07 18:31:41 +02:00
ekzyis
9228328d3b Remove FIXMEs
Rebase on master seemed to have fixed these, weird
2024-07-07 18:31:41 +02:00
ekzyis
2aa0c9bc99 Fix confusing UX around enabled 2024-07-07 18:31:41 +02:00
ekzyis
d7c81cfa9f Fix sendPayment called with empty config
* removed useEffect such that config is available on first render
* fix hydration error using dynamic import without SSR
2024-07-07 18:31:41 +02:00
ekzyis
4a16cc17aa Fix TypeError 2024-07-07 18:31:41 +02:00
ekzyis
4082a45618 wip: Add LND autowithdrawals
* receiving wallets need to export 'server' object field
* don't print macaroon error stack
* fix missing wallet logs order update
* mark autowithdrawal settings as required
* fix server wallet logs deletion
* remove canPay and canReceive since it was confusing where they were available

TODO

* also use numeric priority for sending wallets to be consistent with how status for receiving wallets is determined
* define createInvoice function in wallet definition
* consistent wallet logs: sending wallets use 'wallet attached'+'wallet enabled/disabled' whereas receiving wallets use 'wallet created/updated'
* see FIXMEs
2024-07-07 18:31:41 +02:00
ekzyis
ae0335d537 Don't require destructuring to pass props to input 2024-07-07 18:31:41 +02:00
ekzyis
91978171ed Remove logger.error since already handled in useWallet 2024-07-07 18:31:41 +02:00
ekzyis
dae69ec4b3 Add FIXMEs for LNC
I can't get LNC to connect. It just hangs forever on lnc.connect(). See FIXMEs.
2024-07-07 18:31:40 +02:00
ekzyis
eda7fd6b46 Fix position of log start marker 2024-07-07 18:31:40 +02:00
ekzyis
fd08356d37 Remove follow and show recent logs first 2024-07-07 18:31:40 +02:00
ekzyis
61be80446d Revert "Fix 20s page load for /settings/wallets.json?nodata=true"
This reverts commit deb476b3a966569fefcfdf4082d6b64f90fbd0a2.

Not using the dynamic import for LNC fixed the slow page load with ?nodata=true.
2024-07-07 18:31:40 +02:00
ekzyis
6059e8f691 Use normal imports 2024-07-07 18:31:40 +02:00
ekzyis
1bae891594 Fix extremely slow page load for LNC import
I noticed that the combination of

```
import { Form, PasswordInput, SubmitButton } from '@/components/form'
```

in components/wallet/lnc.js and the dynamic import via `await import` in components/wallet/index.js caused extremely slow page loads.
2024-07-07 18:31:40 +02:00
ekzyis
276e734a7a Fix 20s page load for /settings/wallets.json?nodata=true
For some reason, if nodata is passed (which is the case if going back), the page takes 20s to load.
2024-07-07 18:31:40 +02:00
ekzyis
7b6602e386 wip: Add LNC 2024-07-07 18:31:40 +02:00
ekzyis
8e2dd45e23 Support help, optional, hint in wallet fields 2024-07-07 18:31:40 +02:00
ekzyis
7639390a16 Pass config with spread operator 2024-07-07 18:31:40 +02:00
ekzyis
29646eb956 Use INFO level for 'wallet disabled' message 2024-07-07 18:31:40 +02:00
ekzyis
dd47f2c02b Run validation during save 2024-07-07 18:31:40 +02:00
ekzyis
a5ea53dc39 Fix enableWallet
* wrong storage key was used
* broke if wallets with no configs existed
2024-07-07 18:31:40 +02:00
ekzyis
399c62a7e3 Fix unused isDefault saved in config 2024-07-07 18:31:40 +02:00
ekzyis
034cb4e8b2 Add NWC wallet 2024-07-07 18:31:40 +02:00
ekzyis
b8b0a4f985 Add schema to wallet def 2024-07-07 18:31:40 +02:00
ekzyis
0957cb5b83 Add logging to attach & detach 2024-07-07 18:31:40 +02:00
ekzyis
71c753810c Don't pass logger to sendPayment 2024-07-07 18:31:40 +02:00
ekzyis
0de82db78a Enable wallet if just configured 2024-07-07 18:31:40 +02:00
ekzyis
1a2be99027 Set canPay, canReceive in useWallet 2024-07-07 18:31:40 +02:00
ekzyis
6ac8785c51 Update wallet logging + other stuff
* add canPay and canSend to wallet definition
* rename 'default payment method' to 'enabled' and add enable + disable method
2024-07-07 18:31:40 +02:00
ekzyis
a1b343ac89 Fix import error 2024-07-07 18:31:40 +02:00
ekzyis
5f047cbfc9 wip: Use uniform interface for wallets 2024-07-07 18:31:40 +02:00
504 changed files with 11501 additions and 29048 deletions

View File

@ -1,7 +1,5 @@
PRISMA_SLOW_LOGS_MS= PRISMA_SLOW_LOGS_MS=
GRAPHQL_SLOW_LOGS_MS= GRAPHQL_SLOW_LOGS_MS=
NODE_ENV=development
COMPOSE_PROFILES='minimal,images,search,payments,wallets,email,capture'
############################################################################ ############################################################################
# OPTIONAL SECRETS # # OPTIONAL SECRETS #
@ -29,8 +27,8 @@ SLACK_BOT_TOKEN=
SLACK_CHANNEL_ID= SLACK_CHANNEL_ID=
# lnurl ... you'll need a tunnel to localhost:3000 for these # lnurl ... you'll need a tunnel to localhost:3000 for these
LNAUTH_URL=http://localhost:3000/api/lnauth LNAUTH_URL=
LNWITH_URL=http://localhost:3000/api/lnwith LNWITH_URL=
######################################## ########################################
# SNDEV STUFF WE PRESET # # SNDEV STUFF WE PRESET #
@ -78,7 +76,6 @@ IMGPROXY_MAX_ANIMATION_FRAME_RESOLUTION=200
IMGPROXY_READ_TIMEOUT=10 IMGPROXY_READ_TIMEOUT=10
IMGPROXY_WRITE_TIMEOUT=10 IMGPROXY_WRITE_TIMEOUT=10
IMGPROXY_DOWNLOAD_TIMEOUT=9 IMGPROXY_DOWNLOAD_TIMEOUT=9
IMGPROXY_ENABLE_VIDEO_THUMBNAILS=1
# IMGPROXY_DEVELOPMENT_ERRORS_MODE=1 # IMGPROXY_DEVELOPMENT_ERRORS_MODE=1
# IMGPROXY_ENABLE_DEBUG_HEADERS=true # IMGPROXY_ENABLE_DEBUG_HEADERS=true
@ -116,6 +113,8 @@ POSTGRES_DB=stackernews
# opensearch container stuff # opensearch container stuff
OPENSEARCH_INITIAL_ADMIN_PASSWORD=mVchg1T5oA9wudUh OPENSEARCH_INITIAL_ADMIN_PASSWORD=mVchg1T5oA9wudUh
plugins.security.disabled=true
discovery.type=single-node
DISABLE_SECURITY_DASHBOARDS_PLUGIN=true DISABLE_SECURITY_DASHBOARDS_PLUGIN=true
# bitcoind container stuff # bitcoind container stuff
@ -126,42 +125,27 @@ RPC_PORT=18443
P2P_PORT=18444 P2P_PORT=18444
ZMQ_BLOCK_PORT=28334 ZMQ_BLOCK_PORT=28334
ZMQ_TX_PORT=28335 ZMQ_TX_PORT=28335
ZMQ_HASHBLOCK_PORT=29000
# sn_lnd container stuff # sn lnd container stuff
SN_LND_REST_PORT=8080 LND_REST_PORT=8080
SN_LND_GRPC_PORT=10009 LND_GRPC_PORT=10009
SN_LND_P2P_PORT=9735 LND_P2P_PORT=9735
# docker exec -u lnd sn_lnd lncli newaddress p2wkh --unused # docker exec -u lnd sn_lnd lncli newaddress p2wkh --unused
SN_LND_ADDR=bcrt1q7q06n5st4vqq3lssn0rtkrn2qqypghv9xg2xnl LND_ADDR=bcrt1q7q06n5st4vqq3lssn0rtkrn2qqypghv9xg2xnl
SN_LND_PUBKEY=02cb2e2d5a6c5b17fa67b1a883e2973c82e328fb9bd08b2b156a9e23820c87a490 LND_PUBKEY=02cb2e2d5a6c5b17fa67b1a883e2973c82e328fb9bd08b2b156a9e23820c87a490
# sn_lndk stuff
SN_LNDK_GRPC_PORT=10012
# lnd container stuff # stacker lnd container stuff
LND_REST_PORT=8081 STACKER_LND_REST_PORT=8081
LND_GRPC_PORT=10010 STACKER_LND_GRPC_PORT=10010
# docker exec -u lnd lnd lncli newaddress p2wkh --unused # docker exec -u lnd stacker_lnd lncli newaddress p2wkh --unused
LND_ADDR=bcrt1qfqau4ug9e6rtrvxrgclg58e0r93wshucumm9vu STACKER_LND_ADDR=bcrt1qfqau4ug9e6rtrvxrgclg58e0r93wshucumm9vu
LND_PUBKEY=028093ae52e011d45b3e67f2e0f2cb6c3a1d7f88d2920d408f3ac6db3a56dc4b35 STACKER_LND_PUBKEY=028093ae52e011d45b3e67f2e0f2cb6c3a1d7f88d2920d408f3ac6db3a56dc4b35
# cln container stuff # stacker cln container stuff
CLN_REST_PORT=9092 STACKER_CLN_REST_PORT=9092
# docker exec -u clightning cln lightning-cli newaddr bech32 # docker exec -u clightning stacker_cln lightning-cli newaddr bech32
CLN_ADDR=bcrt1q02sqd74l4pxedy24fg0qtjz4y2jq7x4lxlgzrx STACKER_CLN_ADDR=bcrt1q02sqd74l4pxedy24fg0qtjz4y2jq7x4lxlgzrx
CLN_PUBKEY=03ca7acec181dbf5e427c682c4261a46a0dd9ea5f35d97acb094e399f727835b90 STACKER_CLN_PUBKEY=03ca7acec181dbf5e427c682c4261a46a0dd9ea5f35d97acb094e399f727835b90
# sndev cli eclair getnewaddress
# sndev cli eclair getinfo
ECLAIR_ADDR="bcrt1qdus2yml69wsax3unz8pts9h979lc3s4tw0tpf6"
ECLAIR_PUBKEY="02268c74cc07837041131474881f97d497706b89a29f939555da6d094b65bd5af0"
# router lnd container stuff
ROUTER_LND_REST_PORT=8082
ROUTER_LND_GRPC_PORT=10011
# docker exec -u lnd router_lnd lncli newaddress p2wkh --unused
ROUTER_LND_ADDR=bcrt1qfkmwfpwgn6wt0dd36s79x04swz8vleyafsdpdr
ROUTER_LND_PUBKEY=02750991fbf62e57631888bc469fae69c5e658bd1d245d8ab95ed883517caa33c3
LNCLI_NETWORK=regtest LNCLI_NETWORK=regtest
@ -171,16 +155,8 @@ AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
PERSISTENCE=1 PERSISTENCE=1
SKIP_SSL_CERT_DOWNLOAD=1 SKIP_SSL_CERT_DOWNLOAD=1
# tor proxy # tor
TOR_PROXY=http://tor:7050/ TOR_PROXY=http://127.0.0.1:7050/
grpc_proxy=http://tor:7050/
# lnbits # lnbits
LNBITS_WEB_PORT=5001 LNBITS_WEB_PORT=5001
# CPU shares for each category
CPU_SHARES_IMPORTANT=1024
CPU_SHARES_MODERATE=512
CPU_SHARES_LOW=256
NEXT_TELEMETRY_DISABLED=1

View File

@ -21,5 +21,4 @@ PRISMA_SLOW_LOGS_MS=50
GRAPHQL_SLOW_LOGS_MS=50 GRAPHQL_SLOW_LOGS_MS=50
DB_APP_CONNECTION_LIMIT=4 DB_APP_CONNECTION_LIMIT=4
DB_WORKER_CONNECTION_LIMIT=2 DB_WORKER_CONNECTION_LIMIT=2
DB_TRANSACTION_TIMEOUT=10000 DB_TRANSACTION_TIMEOUT=10000
NEXT_TELEMETRY_DISABLED=1

35
.github/ISSUE_TEMPLATE/bug_report.md vendored Normal file
View File

@ -0,0 +1,35 @@
---
name: Bug report
about: Report a problem
title: ''
labels: bug
assignees: ''
---
*Note: this template is meant to help you report the bug so that we can fix it faster, ie not all of these sections are required*
**Description**
A clear and concise description of what the bug is.
**Steps to Reproduce**
A clear and concise way we might be able to reproduce the bug.
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Logs**
If applicable, add your browsers console logs.
**Environment:**
If you only experience the issue on certain devices or browsers, provide that info.
- Device: [e.g. iPhone6]
- OS: [e.g. iOS]
- Browser [e.g. chrome, safari]
- Version [e.g. 22]
**Additional context**
Add any other context about the problem here.

View File

@ -1,65 +0,0 @@
name: 🐞 Bug report
description: Create a bug report to help us fix it
title: "bug report"
labels: [bug]
body:
- type: checkboxes
attributes:
label: Is there an existing issue for this?
description: Please search to see if an issue already exists for the bug you encountered.
options:
- label: I have searched the existing issues
required: true
- type: textarea
attributes:
label: Describe the bug
description: A clear and concise description of what the bug is. Include images if relevant.
placeholder: I accidentally deleted the internet. Here's my story ...
validations:
required: true
- type: textarea
attributes:
label: Screenshots
description: |
Add screenshots to help explain your problem. You can also add a video here.
Tip: You can attach images or video files by clicking this area to highlight it and then dragging files in.
validations:
required: false
- type: textarea
attributes:
label: Steps To Reproduce
description: Steps to reproduce the bug.
placeholder: |
1. Go to '...'
2. Click on '...'
3. Scroll to '...'
4. See error
validations:
required: true
- type: textarea
attributes:
label: Expected behavior
description: A clear and concise description of what you expected to happen
validations:
required: true
- type: textarea
attributes:
label: Logs
description: If applicable, add your browser's console logs here
- type: textarea
attributes:
label: Device information
placeholder: |
- OS: [e.g. Windows]
- Browser: [e.g. chrome, safari, firefox]
- Browser Version: [e.g. 22]
validations:
required: false
- type: textarea
attributes:
label: Additional context
description: |
Do you have links to discussions about this on SN or other references?
validations:
required: false

View File

@ -1,5 +0,0 @@
blank_issues_enabled: true
contact_links:
- name: Questions
url: https://stacker.news/~meta
about: If you simply have a question, you can ask it in ~meta or the saloon.

View File

@ -0,0 +1,20 @@
---
name: Feature request
about: Suggest a feature
title: ''
labels: feature
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.

View File

@ -1,32 +0,0 @@
name: ✨ Feature request
description: Request a feature you'd like to see in SN!
title: "feature request"
labels: [feature]
body:
- type: markdown
attributes:
value: |
We're always looking for suggestions on how we could improve SN!
- type: textarea
attributes:
label: Describe the problem you're trying to solve
description: |
Is your feature request related to a problem? Add a clear and concise description of what the problem is.
validations:
required: true
- type: textarea
attributes:
label: Describe the solution you'd like
description: A clear and concise description of what you want to happen.
validations:
required: true
- type: textarea
attributes:
label: Describe alternatives you've considered
description: |
A clear and concise description of any alternative solutions or features you have considered.
- type: textarea
attributes:
label: Additional context
description: |
Add any other additional context or screenshots about the feature request here.

View File

@ -1,22 +1,45 @@
## Description ## Description
_A clear and concise description of what you changed and why._ <!--
A clear and concise description of what you changed and why.
Don't forget to mention which tickets this closes (if any).
Use following syntax to close them automatically on merge: closes #<number>
-->
## Screenshots ## Screenshots
<!--
If your changes are user facing, please add screenshots of the new UI.
You can also create a video to showcase your changes (useful to show UX).
-->
## Additional Context ## Additional Context
_Was anything unclear during your work on this PR? Anything we should definitely take a closer look at?_ <!--
You can mention here anything that you think is relevant for this PR. Some examples:
* You encountered something that you didn't understand while working on this PR
* You were not sure about something you did but did not find a better way
* You initially had a different approach but went with this one for some reason
-->
## Checklist ## Checklist
**Are your changes backwards compatible? Please answer below:** **Are your changes backwards compatible? Please answer below:**
<!-- put your answer about backwards compatibility here -->
**On a scale of 1-10 how well and how have you QA'd this change and any features it might affect? Please answer below:** <!--
If your PR is not ready for review yet, please mark your PR as a draft.
If changes were requested, request a new review when you incorporated the feedback.
-->
**Did you QA this? Could we deploy this straight to production? Please answer below:**
<!-- put your answer about QA here -->
**For frontend changes: Tested on mobile, light and dark mode? Please answer below:** **For frontend changes: Tested on mobile? Please answer below:**
<!-- put your answer about mobile QA here -->
**Did you introduce any new environment variables? If so, call them out explicitly here:** **Did you introduce any new environment variables? If so, call them out explicitly here:**
<!-- put your answer about env vars here -->

View File

@ -1,35 +0,0 @@
name: extend-awards
on:
pull_request_target:
types: [ closed ]
branches:
- master
permissions:
pull-requests: write
contents: write
issues: read
jobs:
if_merged:
if: |
github.event_name == 'pull_request_target' &&
github.event.action == 'closed' &&
github.event.pull_request.merged == true &&
github.event.pull_request.head.ref != 'extend-awards/patch'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: '3.13'
- run: pip install requests
- run: python extend-awards.py
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_CONTEXT: ${{ toJson(github) }}
- uses: peter-evans/create-pull-request@v7
with:
add-paths: awards.csv
branch: extend-awards/patch
commit-message: Extending awards.csv
title: Extending awards.csv
body: A PR was merged that solves an issue and awards.csv should be extended.

View File

@ -1,8 +1,8 @@
name: Lint Check name: Eslint Check
on: [pull_request] on: [pull_request]
jobs: jobs:
lint-run: eslint-run:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- name: Checkout - name: Checkout
@ -11,7 +11,7 @@ jobs:
- name: Setup Node.js - name: Setup Node.js
uses: actions/setup-node@v3 uses: actions/setup-node@v3
with: with:
node-version: "18.20.4" node-version: "18.17.0"
- name: Install - name: Install
run: npm install run: npm install

View File

@ -11,7 +11,7 @@ jobs:
- name: Setup Node.js - name: Setup Node.js
uses: actions/setup-node@v3 uses: actions/setup-node@v3
with: with:
node-version: "18.20.4" node-version: "18.17.0"
- name: Install - name: Install
run: npm install run: npm install

10
.gitignore vendored
View File

@ -56,13 +56,3 @@ docker-compose.*.yml
# nostr wallet connect # nostr wallet connect
scripts/nwc-keys.json scripts/nwc-keys.json
# lnbits
docker/lnbits/data
# lndk
!docker/lndk/tls-*.pem
# nostr link extract
scripts/nostr-link-extract.config.json
scripts/nostr-links.db

View File

@ -1,6 +1,6 @@
# syntax=docker/dockerfile:1 # syntax=docker/dockerfile:1
FROM node:18.20.4-bullseye FROM node:18.17.0-bullseye
ENV NODE_ENV=development ENV NODE_ENV=development

View File

@ -5,7 +5,7 @@
</p> </p>
- Stacker News is trying to fix online communities with economics - Stacker News makes internet communities that pay you Bitcoin
- What You See is What We Ship (look ma, I invented an initialism) - What You See is What We Ship (look ma, I invented an initialism)
- 100% FOSS - 100% FOSS
- We pay bitcoin for PRs, issues, documentation, code reviews and more - We pay bitcoin for PRs, issues, documentation, code reviews and more
@ -30,9 +30,7 @@ Go to [localhost:3000](http://localhost:3000).
- Clone the repo - Clone the repo
- ssh: `git clone git@github.com:stackernews/stacker.news.git` - ssh: `git clone git@github.com:stackernews/stacker.news.git`
- https: `git clone https://github.com/stackernews/stacker.news.git` - https: `git clone https://github.com/stackernews/stacker.news.git`
- Install [docker](https://docs.docker.com/compose/install/) - Install [docker](https://docs.docker.com/get-docker/)
- If you're running MacOS or Windows, I ***highly recommend*** using [OrbStack](https://orbstack.dev/) instead of Docker Desktop
- Please make sure that at least 10 GB of free space is available, otherwise you may encounter issues while setting up the development environment.
<br> <br>
@ -65,55 +63,64 @@ USAGE
$ sndev help [COMMAND] $ sndev help [COMMAND]
COMMANDS COMMANDS
help show help help show help
env: env:
start start env start start env
stop stop env stop stop env
restart restart env restart restart env
status status of env status status of env
logs logs from env logs logs from env
delete delete env delete delete env
sn: sn:
login login as a nym login login as a nym
set_balance set the balance of a nym
lightning: lnd:
fund pay a bolt11 for funding fund pay a bolt11 for funding
withdraw create a bolt11 for withdrawal withdraw create a bolt11 for withdrawal
cln:
cln_fund pay a bolt11 for funding with CLN
cln_withdraw create a bolt11 for withdrawal with CLN
db: db:
psql open psql on db psql open psql on db
prisma run prisma commands prisma run prisma commands
dev: dev:
pr fetch and checkout a pr pr fetch and checkout a pr
lint run linters lint run linters
test run tests
other: other:
cli service cli passthrough compose docker compose passthrough
open open service GUI in browser sn_lndcli lncli passthrough on sn_lnd
onion service onion address stacker_lndcli lncli passthrough on stacker_lnd
cert service tls cert stacker_clncli lightning-cli passthrough on stacker_cln
compose docker compose passthrough
``` ```
### Modifying services ### Modifying services
#### Running specific services #### Running specific services
By default all services will be run. If you want to exclude specific services from running, set `COMPOSE_PROFILES` in a `.env.local` file to one or more of `minimal,images,search,payments,wallets,email,capture`. To only run the minimal services necessary without things like payments in `.env.local`: By default all services will be run. If you want to exclude specific services from running, set `COMPOSE_PROFILES` to use one or more of `minimal|images|search|payments|wallets|email|capture`. To only run minimal services without images, search, email, wallets, or payments:
```.env ```sh
COMPOSE_PROFILES=minimal $ COMPOSE_PROFILES=minimal ./sndev start
```
Or, as I would recommend:
```sh
$ export COMPOSE_PROFILES=minimal
$ ./sndev start
``` ```
To run with images and payments services: To run with images and payments services:
```.env ```sh
COMPOSE_PROFILES=images,payments $ COMPOSE_PROFILES=images,payments ./sndev start
``` ```
#### Merging compose files #### Merging compose files
@ -226,7 +233,6 @@ _Due to Rule 3, make sure that you mark your PR as a draft when you create it an
| tag | multiplier | | tag | multiplier |
| ----------------- | ---------- | | ----------------- | ---------- |
| `priority:low` | 0.5 |
| `priority:medium` | 1.5 | | `priority:medium` | 1.5 |
| `priority:high` | 2 | | `priority:high` | 2 |
| `priority:urgent` | 3 | | `priority:urgent` | 3 |
@ -364,11 +370,9 @@ You can connect to the local database via `./sndev psql`. [psql](https://www.pos
<br> <br>
## Running cli on local lightning nodes ## Running lncli on the local lnd nodes
You can run `lncli` on the local lnd nodes via `./sndev cli lnd` and `./sndev cli sn_lnd`. The node for your local SN instance is `sn_lnd` and the node serving as any external node, like a stacker's node or external wallet, is `lnd`. You can run `lncli` on the local lnd nodes via `./sndev sn_lncli` and `./sndev stacker_lncli`. The node for your local SN instance is `sn_lnd` and the node serving as any external node, like a stacker's node or external wallet, is `stacker_lnd`.
You can run `lightning-cli` on the local cln node via `./sndev cli cln` which serves as an external node or wallet.
<br> <br>
@ -427,7 +431,7 @@ GITHUB_SECRET=<Client secret>
## Enabling web push notifications ## Enabling web push notifications
To enable Web Push locally, you will need to set the `VAPID_*` env vars. `VAPID_MAILTO` needs to be an email address using the `mailto:` scheme. For `NEXT_PUBLIC_VAPID_PUBKEY` and `VAPID_PRIVKEY`, you can run `npx web-push generate-vapid-keys`. To enable Web Push locally, you will need to set the `VAPID_*` env vars. `VAPID_MAILTO` needs to be an email address using the `mailto:` scheme. For `NEXT_PUBLIC_VAPID_KEY` and `VAPID_PRIVKEY`, you can run `npx web-push generate-vapid-keys`.
<br> <br>
@ -455,9 +459,7 @@ In addition, we run other critical services the above services interact with lik
## Wallet transaction safety ## Wallet transaction safety
To ensure stackers balances are kept sane, some wallet updates are run in [serializable transactions](https://www.postgresql.org/docs/current/transaction-iso.html#XACT-SERIALIZABLE) at the database level. Because early versions of prisma had relatively poor support for transactions most wallet touching code is written in [plpgsql](https://www.postgresql.org/docs/current/plpgsql.html) stored procedures and can be found in the `prisma/migrations` folder. To ensure stackers balances are kept sane, all wallet updates are run in [serializable transactions](https://www.postgresql.org/docs/current/transaction-iso.html#XACT-SERIALIZABLE) at the database level. Because early versions of prisma had relatively poor support for transactions most wallet touching code is written in [plpgsql](https://www.postgresql.org/docs/current/plpgsql.html) stored procedures and can be found in the `prisma/migrations` folder.
*UPDATE*: Most wallet updates are now run in [read committed](https://www.postgresql.org/docs/current/transaction-iso.html#XACT-READ-COMMITTED) transactions. See `api/paidAction/README.md` for more information.
<br> <br>

View File

@ -1,20 +1,13 @@
import { cachedFetcher } from '@/lib/fetch' import lndService from 'ln-service'
import { toPositiveNumber } from '@/lib/format'
import { authenticatedLndGrpc } from '@/lib/lnd'
import { getIdentity, getHeight, getWalletInfo, getNode, getPayment, parsePaymentRequest } from 'ln-service'
import { datePivot } from '@/lib/time'
import { LND_PATHFINDING_TIMEOUT_MS } from '@/lib/constants'
const lnd = global.lnd || authenticatedLndGrpc({ const { lnd } = lndService.authenticatedLndGrpc({
cert: process.env.LND_CERT, cert: process.env.LND_CERT,
macaroon: process.env.LND_MACAROON, macaroon: process.env.LND_MACAROON,
socket: process.env.LND_SOCKET socket: process.env.LND_SOCKET
}).lnd })
if (process.env.NODE_ENV === 'development') global.lnd = lnd
// Check LND GRPC connection // Check LND GRPC connection
getWalletInfo({ lnd }, (err, result) => { lndService.getWalletInfo({ lnd }, (err, result) => {
if (err) { if (err) {
console.error('LND GRPC connection error') console.error('LND GRPC connection error')
return return
@ -22,181 +15,4 @@ getWalletInfo({ lnd }, (err, result) => {
console.log('LND GRPC connection successful') console.log('LND GRPC connection successful')
}) })
export async function estimateRouteFee ({ lnd, destination, tokens, mtokens, request, timeout }) {
// if the payment request includes us as a route hint, we need to use the destination and amount
// otherwise, this will fail with a self-payment error
if (request) {
const inv = parsePaymentRequest({ request })
const ourPubkey = await getOurPubkey({ lnd })
if (Array.isArray(inv.routes)) {
for (const route of inv.routes) {
if (Array.isArray(route)) {
for (const hop of route) {
if (hop.public_key === ourPubkey) {
console.log('estimateRouteFee ignoring self-payment route')
request = false
break
}
}
}
}
}
}
return await new Promise((resolve, reject) => {
const params = {}
if (request) {
console.log('estimateRouteFee using payment request')
params.payment_request = request
} else {
console.log('estimateRouteFee using destination and amount')
params.dest = Buffer.from(destination, 'hex')
params.amt_sat = tokens ? toPositiveNumber(tokens) : toPositiveNumber(BigInt(mtokens) / BigInt(1e3))
}
lnd.router.estimateRouteFee({
...params,
timeout
}, (err, res) => {
if (err) {
if (res?.failure_reason) {
reject(new Error(`Unable to estimate route: ${res.failure_reason}`))
} else {
reject(err)
}
return
}
if (res.routing_fee_msat < 0 || res.time_lock_delay <= 0) {
reject(new Error('Unable to estimate route, excessive values: ' + JSON.stringify(res)))
return
}
resolve({
routingFeeMsat: toPositiveNumber(res.routing_fee_msat),
timeLockDelay: toPositiveNumber(res.time_lock_delay)
})
})
})
}
// created_height is the accepted_height, timeout is the expiry height
// ln-service remaps the `htlcs` field of lookupInvoice to `payments` and
// see: https://github.com/alexbosworth/lightning/blob/master/lnd_responses/htlc_as_payment.js
// and: https://lightning.engineering/api-docs/api/lnd/lightning/lookup-invoice/index.html#lnrpcinvoicehtlc
export function hodlInvoiceCltvDetails (inv) {
if (!inv.payments) {
throw new Error('No payments found')
}
if (!inv.is_held) {
throw new Error('Invoice is not held')
}
const acceptHeight = inv.payments.reduce((max, htlc) => {
const createdHeight = toPositiveNumber(htlc.created_height)
return createdHeight > max ? createdHeight : max
}, 0)
const expiryHeight = inv.payments.reduce((min, htlc) => {
const timeout = toPositiveNumber(htlc.timeout)
return timeout < min ? timeout : min
}, Number.MAX_SAFE_INTEGER)
return {
expiryHeight: toPositiveNumber(expiryHeight),
acceptHeight: toPositiveNumber(acceptHeight)
}
}
export function getPaymentFailureStatus (withdrawal) {
if (withdrawal && !withdrawal.is_failed) {
throw new Error('withdrawal is not failed')
}
if (withdrawal?.failed?.is_insufficient_balance) {
return {
status: 'INSUFFICIENT_BALANCE',
message: 'you didn\'t have enough sats'
}
} else if (withdrawal?.failed?.is_invalid_payment) {
return {
status: 'INVALID_PAYMENT',
message: 'invalid payment'
}
} else if (withdrawal?.failed?.is_pathfinding_timeout) {
return {
status: 'PATHFINDING_TIMEOUT',
message: 'no route found'
}
} else if (withdrawal?.failed?.is_route_not_found) {
return {
status: 'ROUTE_NOT_FOUND',
message: 'no route found'
}
}
return {
status: 'UNKNOWN_FAILURE',
message: 'unknown failure'
}
}
export const getBlockHeight = cachedFetcher(async function fetchBlockHeight ({ lnd, ...args }) {
try {
const { current_block_height: height } = await getHeight({ lnd, ...args })
return height
} catch (err) {
throw new Error(`Unable to fetch block height: ${err.message}`)
}
}, {
maxSize: 1,
cacheExpiry: 60 * 1000, // 1 minute
forceRefreshThreshold: 5 * 60 * 1000, // 5 minutes
keyGenerator: () => 'getHeight'
})
export const getOurPubkey = cachedFetcher(async function fetchOurPubkey ({ lnd, ...args }) {
try {
const identity = await getIdentity({ lnd, ...args })
return identity.public_key
} catch (err) {
throw new Error(`Unable to fetch identity: ${err.message}`)
}
}, {
maxSize: 1,
cacheExpiry: 0, // never expire
forceRefreshThreshold: 0, // never force refresh
keyGenerator: () => 'getOurPubkey'
})
export const getNodeSockets = cachedFetcher(async function fetchNodeSockets ({ lnd, ...args }) {
try {
return (await getNode({ lnd, is_omitting_channels: true, ...args }))?.sockets
} catch (err) {
throw new Error(`Unable to fetch node info: ${err.message}`)
}
}, {
maxSize: 100,
cacheExpiry: 1000 * 60 * 60 * 24, // 1 day
forceRefreshThreshold: 1000 * 60 * 60 * 24 * 7, // 1 week
keyGenerator: (args) => {
const { public_key: publicKey } = args
return publicKey
}
})
export async function getPaymentOrNotSent ({ id, lnd, createdAt }) {
try {
return await getPayment({ id, lnd })
} catch (err) {
if (err[1] === 'SentPaymentNotFound' &&
createdAt < datePivot(new Date(), { milliseconds: -LND_PATHFINDING_TIMEOUT_MS * 2 })) {
// if the payment is older than 2x timeout, but not found in LND, we can assume it errored before lnd stored it
return { notSent: true, is_failed: true }
} else {
throw err
}
}
}
export default lnd export default lnd

3
api/package.json Normal file
View File

@ -0,0 +1,3 @@
{
"type": "module"
}

View File

@ -2,38 +2,6 @@
Paid actions are actions that require payments to perform. Given that we support several payment flows, some of which require more than one round of communication either with LND or the client, and several paid actions, we have this plugin-like interface to easily add new paid actions. Paid actions are actions that require payments to perform. Given that we support several payment flows, some of which require more than one round of communication either with LND or the client, and several paid actions, we have this plugin-like interface to easily add new paid actions.
<details>
<summary>internals</summary>
All paid action progress, regardless of flow, is managed using a state machine that's transitioned by the invoice progress and payment progress (in the case of p2p paid action). Below is the full state machine for paid actions:
```mermaid
stateDiagram-v2
[*] --> PENDING
PENDING --> PAID
PENDING --> CANCELING
PENDING --> FAILED
PAID --> [*]
CANCELING --> FAILED
FAILED --> RETRYING
FAILED --> [*]
RETRYING --> [*]
[*] --> PENDING_HELD
PENDING_HELD --> HELD
PENDING_HELD --> FORWARDING
PENDING_HELD --> CANCELING
PENDING_HELD --> FAILED
HELD --> PAID
HELD --> CANCELING
HELD --> FAILED
FORWARDING --> FORWARDED
FORWARDING --> FAILED_FORWARD
FORWARDED --> PAID
FAILED_FORWARD --> CANCELING
FAILED_FORWARD --> FAILED
```
</details>
## Payment Flows ## Payment Flows
There are three payment flows: There are three payment flows:
@ -49,20 +17,11 @@ For paid actions that support it, if the stacker doesn't have enough fee credits
<details> <details>
<summary>Internals</summary> <summary>Internals</summary>
Internally, optimistic flows make use of a state machine that's transitioned by the invoice payment progress. Internally, optimistic flows make use of a state machine that's transitioned by the invoice payment progress. All optimistic actions start in a `PENDING` state and have the following transitions:
```mermaid - `PENDING` -> `PAID`: when the invoice is paid
stateDiagram-v2 - `PENDING` -> `FAILED`: when the invoice expires or is cancelled
[*] --> PENDING - `FAILED` -> `RETRYING`: when the invoice for the action is replaced with a new invoice
PENDING --> PAID
PENDING --> CANCELING
PENDING --> FAILED
PAID --> [*]
CANCELING --> FAILED
FAILED --> RETRYING
FAILED --> [*]
RETRYING --> [*]
```
</details> </details>
### Pessimistic ### Pessimistic
@ -73,68 +32,27 @@ Internally, pessimistic flows use hold invoices. If the action doesn't succeed,
<details> <details>
<summary>Internals</summary> <summary>Internals</summary>
Internally, pessimistic flows make use of a state machine that's transitioned by the invoice payment progress much like optimistic flows, but with extra steps. Internally, pessimistic flows make use of a state machine that's transitioned by the invoice payment progress much like optimistic flows, but with extra steps. All pessimistic actions start in a `PENDING_HELD` state and has the following transitions:
```mermaid - `PENDING_HELD` -> `HELD`: when the invoice is paid and the action's `perform` is run and the invoice is settled
stateDiagram-v2 - `HELD` -> `PAID`: when the action's `onPaid` is called
PAID --> [*] - `PENDING_HELD` -> `FAILED`: when the invoice for the action expires or is cancelled
CANCELING --> FAILED - `HELD` -> `FAILED`: when the action fails after the invoice is paid
FAILED --> [*]
[*] --> PENDING_HELD
PENDING_HELD --> HELD
PENDING_HELD --> CANCELING
PENDING_HELD --> FAILED
HELD --> PAID
HELD --> CANCELING
HELD --> FAILED
```
</details> </details>
### Table of existing paid actions and their supported flows ### Table of existing paid actions and their supported flows
| action | fee credits | optimistic | pessimistic | anonable | qr payable | p2p wrapped | side effects | reward sats | p2p direct | | action | fee credits | optimistic | pessimistic | anonable | qr payable | p2p wrapped | side effects |
| ----------------- | ----------- | ---------- | ----------- | -------- | ---------- | ----------- | ------------ | ----------- | ---------- | | ----------------- | ----------- | ---------- | ----------- | -------- | ---------- | ----------- | ------------ |
| zaps | x | x | x | x | x | x | x | | | | zaps | x | x | x | x | x | x | x |
| posts | x | x | x | x | x | | x | x | | | posts | x | x | x | x | x | | x |
| comments | x | x | x | x | x | | x | x | | | comments | x | x | x | x | x | | x |
| downzaps | x | x | | | x | | x | x | | | downzaps | x | x | | | x | | x |
| poll votes | x | x | | | x | | | x | | | poll votes | x | x | | | x | | |
| territory actions | x | | x | | x | | | x | | | territory actions | x | | x | | x | | |
| donations | x | | x | x | x | | | x | | | donations | x | | x | x | x | | |
| update posts | x | | x | | x | | x | x | | | update posts | x | | x | | x | | x |
| update comments | x | | x | | x | | x | x | | | update comments | x | | x | | x | | x |
| receive | | x | | | x | x | x | | x |
| buy fee credits | | | x | | x | | | x | |
| invite gift | x | | | | | | x | x | |
## Not-custodial zaps (ie p2p wrapped payments)
Zaps, and possibly other future actions, can be performed peer to peer and non-custodially. This means that the payment is made directly from the client to the recipient, without the server taking custody of the funds. Currently, in order to trigger this behavior, the recipient must have a receiving wallet attached and the sender must have insufficient funds in their custodial wallet to perform the requested zap.
This works by requesting an invoice from the recipient's wallet and reusing the payment hash in a hold invoice paid to SN (to collect the sybil fee) which we serve to the sender. When the sender pays this wrapped invoice, we forward our own money to the recipient, who then reveals the preimage to us, allowing us to settle the wrapped invoice and claim the sender's funds. This effectively does what a lightning node does when forwarding a payment but allows us to do it at the application layer.
<details>
<summary>Internals</summary>
Internally, p2p wrapped payments make use of the same paid action state machine but it's transitioned by both the incoming invoice payment progress *and* the outgoing invoice payment progress.
```mermaid
stateDiagram-v2
PAID --> [*]
CANCELING --> FAILED
FAILED --> RETRYING
FAILED --> [*]
RETRYING --> [*]
[*] --> PENDING_HELD
PENDING_HELD --> FORWARDING
PENDING_HELD --> CANCELING
PENDING_HELD --> FAILED
FORWARDING --> FORWARDED
FORWARDING --> FAILED_FORWARD
FORWARDED --> PAID
FAILED_FORWARD --> CANCELING
FAILED_FORWARD --> FAILED
```
</details>
## Paid Action Interface ## Paid Action Interface
@ -142,16 +60,10 @@ Each paid action is implemented in its own file in the `paidAction` directory. E
### Boolean flags ### Boolean flags
- `anonable`: can be performed anonymously - `anonable`: can be performed anonymously
- `supportsPessimism`: supports a pessimistic payment flow
- `supportsOptimism`: supports an optimistic payment flow
### Payment methods #### Functions
- `paymentMethods`: an array of payment methods that the action supports ordered from most preferred to least preferred
- P2P: a p2p payment made directly from the client to the recipient
- after wrapping the invoice, anonymous users will follow a PESSIMISTIC flow to pay the invoice and logged in users will follow an OPTIMISTIC flow
- FEE_CREDIT: a payment made from the user's fee credit balance
- OPTIMISTIC: an optimistic payment flow
- PESSIMISTIC: a pessimistic payment flow
### Functions
All functions have the following signature: `function(args: Object, context: Object): Promise` All functions have the following signature: `function(args: Object, context: Object): Promise`
@ -163,11 +75,7 @@ All functions have the following signature: `function(args: Object, context: Obj
- it can optionally store in the invoice with the `invoiceId` the `actionId` to be able to link the action with the invoice regardless of retries - it can optionally store in the invoice with the `invoiceId` the `actionId` to be able to link the action with the invoice regardless of retries
- `onPaid`: called when the action is paid - `onPaid`: called when the action is paid
- if the action does not support optimism, this function is optional - if the action does not support optimism, this function is optional
- this function should be used to mark the rows created in `perform` as `PAID` and perform critical side effects of the action (like denormalizations) - this function should be used to mark the rows created in `perform` as `PAID` and perform any other side effects of the action (like notifications or denormalizations)
- `nonCriticalSideEffects`: called after the action is paid to run any side effects whose failure does not affect the action's execution
- this function is always optional
- it's passed the result of the action (or the action's paid invoice) and the current context
- this is where things like push notifications should be handled
- `onFail`: called when the action fails - `onFail`: called when the action fails
- if the action does not support optimism, this function is optional - if the action does not support optimism, this function is optional
- this function should be used to mark the rows created in `perform` as `FAILED` - this function should be used to mark the rows created in `perform` as `FAILED`
@ -176,11 +84,8 @@ All functions have the following signature: `function(args: Object, context: Obj
- this function is called when an optimistic action is retried - this function is called when an optimistic action is retried
- it's passed the original `invoiceId` and the `newInvoiceId` - it's passed the original `invoiceId` and the `newInvoiceId`
- this function should update the rows created in `perform` to contain the new `newInvoiceId` and remark the row as `PENDING` - this function should update the rows created in `perform` to contain the new `newInvoiceId` and remark the row as `PENDING`
- `getInvoiceablePeer`: returns the userId of the peer that's capable of generating an invoice so they can be paid for the action
- this is only used for p2p wrapped zaps currently
- `describe`: returns a description as a string of the action - `describe`: returns a description as a string of the action
- for actions that require generating an invoice, and for stackers that don't hide invoice descriptions, this is used in the invoice description - for actions that require generating an invoice, and for stackers that don't hide invoice descriptions, this is used in the invoice description
- `getSybilFeePercent` (required if `getInvoiceablePeer` is implemented): returns the action sybil fee percent as a `BigInt` (eg. 30n for 30%)
#### Function arguments #### Function arguments
@ -189,17 +94,10 @@ All functions have the following signature: `function(args: Object, context: Obj
`context` contains the following fields: `context` contains the following fields:
- `me`: the user performing the action (undefined if anonymous) - `me`: the user performing the action (undefined if anonymous)
- `cost`: the cost of the action in msats as a `BigInt` - `cost`: the cost of the action in msats as a `BigInt`
- `sybilFeePercent`: the sybil fee percent as a `BigInt` (eg. 30n for 30%)
- `tx`: the current transaction (for anything that needs to be done atomically with the payment) - `tx`: the current transaction (for anything that needs to be done atomically with the payment)
- `models`: the current prisma client (for anything that doesn't need to be done atomically with the payment) - `models`: the current prisma client (for anything that doesn't need to be done atomically with the payment)
- `lnd`: the current lnd client - `lnd`: the current lnd client
## Recording Cowboy Credits
To avoid adding sats and credits together everywhere to show an aggregate sat value, in most cases we denormalize a `sats` field that carries the "sats value", the combined sats + credits of something, and a `credits` field that carries only the earned `credits`. For example, the `Item` table has an `msats` field that carries the sum of the `mcredits` and `msats` earned and a `mcredits` field that carries the value of the `mcredits` earned. So, the sats value an item earned is `item.msats` BUT the real sats earned is `item.msats - item.mcredits`.
The ONLY exception to this is the `users` table where we store a stacker's rewards sats and credits balances separately.
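A tiny sketch of that arithmetic (the helper name is illustrative; it assumes `msats` and `mcredits` are BigInt millisat values as described above):

```js
// item.msats already includes mcredits, so subtract to get the sats actually earned
function itemSatsBreakdown (item) {
  return {
    satsValue: Number(item.msats / 1000n), // combined sats + credits value
    realSats: Number((item.msats - item.mcredits) / 1000n) // excludes cowboy credits
  }
}
```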
## `IMPORTANT: transaction isolation` ## `IMPORTANT: transaction isolation`
We use a `read committed` isolation level for actions. This means paid actions need to be mindful of concurrency issues. Specifically, reading data from the database and then writing it back in `read committed` is a common source of consistency bugs (aka serialization anomalies). We use a `read committed` isolation level for actions. This means paid actions need to be mindful of concurrency issues. Specifically, reading data from the database and then writing it back in `read committed` is a common source of consistency bugs (aka serialization anomalies).
@ -250,7 +148,7 @@ COMMIT;
-- item_zaps.sats is 100, but we would expect it to be 200 -- item_zaps.sats is 100, but we would expect it to be 200
``` ```
Note that row level locks wouldn't help in this case, because we can't lock the rows that the transactions don't know to exist yet. Note that row level locks wouldn't help in this case, because we can't lock the rows that the transactions doesn't know to exist yet.
#### Subqueries are still incorrect #### Subqueries are still incorrect
@ -303,69 +201,4 @@ From the [postgres docs](https://www.postgresql.org/docs/current/transaction-iso
> UPDATE, DELETE, SELECT FOR UPDATE, and SELECT FOR SHARE commands behave the same as SELECT in terms of searching for target rows: they will only find target rows that were committed as of the command start time. However, such a target row might have already been updated (or deleted or locked) by another concurrent transaction by the time it is found. In this case, the would-be updater will wait for the first updating transaction to commit or roll back (if it is still in progress). If the first updater rolls back, then its effects are negated and the second updater can proceed with updating the originally found row. If the first updater commits, the second updater will ignore the row if the first updater deleted it, otherwise it will attempt to apply its operation to the updated version of the row. The search condition of the command (the WHERE clause) is re-evaluated to see if the updated version of the row still matches the search condition. If so, the second updater proceeds with its operation using the updated version of the row. In the case of SELECT FOR UPDATE and SELECT FOR SHARE, this means it is the updated version of the row that is locked and returned to the client. > UPDATE, DELETE, SELECT FOR UPDATE, and SELECT FOR SHARE commands behave the same as SELECT in terms of searching for target rows: they will only find target rows that were committed as of the command start time. However, such a target row might have already been updated (or deleted or locked) by another concurrent transaction by the time it is found. In this case, the would-be updater will wait for the first updating transaction to commit or roll back (if it is still in progress). If the first updater rolls back, then its effects are negated and the second updater can proceed with updating the originally found row. If the first updater commits, the second updater will ignore the row if the first updater deleted it, otherwise it will attempt to apply its operation to the updated version of the row. The search condition of the command (the WHERE clause) is re-evaluated to see if the updated version of the row still matches the search condition. If so, the second updater proceeds with its operation using the updated version of the row. In the case of SELECT FOR UPDATE and SELECT FOR SHARE, this means it is the updated version of the row that is locked and returned to the client.
From the [postgres source docs](https://git.postgresql.org/gitweb/?p=postgresql.git;a=blob;f=src/backend/executor/README#l350): From the [postgres source docs](https://git.postgresql.org/gitweb/?p=postgresql.git;a=blob;f=src/backend/executor/README#l350):
> It is also possible that there are relations in the query that are not to be locked (they are neither the UPDATE/DELETE/MERGE target nor specified to be locked in SELECT FOR UPDATE/SHARE). When re-running the test query ***we want to use the same rows*** from these relations that were joined to the locked rows. > It is also possible that there are relations in the query that are not to be locked (they are neither the UPDATE/DELETE/MERGE target nor specified to be locked in SELECT FOR UPDATE/SHARE). When re-running the test query ***we want to use the same rows*** from these relations that were joined to the locked rows.
## `IMPORTANT: deadlocks`
Deadlocks can occur when two transactions are each waiting for the other to release a lock. This typically happens when they lock the same rows in different orders, whether the locks are taken explicitly or implicitly.
If both transactions lock the rows in the same order, the deadlock is avoided (see the sketch after the example below).
### Incorrect
```sql
-- transaction 1
BEGIN;
UPDATE users set msats = msats + 1 WHERE id = 1;
-- transaction 2
BEGIN;
UPDATE users set msats = msats + 1 WHERE id = 2;
-- transaction 1 (blocks here until transaction 2 commits)
UPDATE users set msats = msats + 1 WHERE id = 2;
-- transaction 2 (blocks here until transaction 1 commits)
UPDATE users set msats = msats + 1 WHERE id = 1;
-- deadlock occurs because neither transaction can proceed
```
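For contrast, here is a deadlock-free version of the same two transactions (an illustrative sketch, not a query from the codebase): both take the lock on user 1 before the lock on user 2, so the second transaction simply waits instead of deadlocking.
```sql
-- transaction 1
BEGIN;
UPDATE users set msats = msats + 1 WHERE id = 1;
-- transaction 2
BEGIN;
-- blocks here until transaction 1 commits, because both
-- transactions lock user 1 before user 2
UPDATE users set msats = msats + 1 WHERE id = 1;
-- transaction 1 (already holds user 1, now takes user 2)
UPDATE users set msats = msats + 1 WHERE id = 2;
COMMIT;
-- transaction 2 (resumes once transaction 1 commits and finishes cleanly)
UPDATE users set msats = msats + 1 WHERE id = 2;
COMMIT;
```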
In practice, this most often occurs when selecting multiple rows for update in different orders. Recently, we had a deadlock when splitting zaps to multiple users. The solution was to select the rows for update in the same order.
### Incorrect
```sql
WITH forwardees AS (
SELECT "userId", (($1::BIGINT * pct) / 100)::BIGINT AS msats
FROM "ItemForward"
WHERE "itemId" = $2::INTEGER
)
UPDATE users
SET
msats = users.msats + forwardees.msats,
"stackedMsats" = users."stackedMsats" + forwardees.msats
FROM forwardees
WHERE users.id = forwardees."userId";
```
If forwardees are selected in a different order in two concurrent transactions, e.g. (1,2) in tx 1 and (2,1) in tx 2, a deadlock can occur. To avoid this, always select rows for update in the same order.
### Correct
We fixed the deadlock by selecting the forwardees in the same order in these transactions.
```sql
WITH forwardees AS (
SELECT "userId", (($1::BIGINT * pct) / 100)::BIGINT AS msats
FROM "ItemForward"
WHERE "itemId" = $2::INTEGER
ORDER BY "userId" ASC
)
UPDATE users
SET
msats = users.msats + forwardees.msats,
"stackedMsats" = users."stackedMsats" + forwardees.msats
FROM forwardees
WHERE users.id = forwardees."userId";
```
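If a query can't easily be made to take its locks in a deterministic order, another common technique (a sketch of the general pattern, not what we shipped) is to lock the target rows up front with `SELECT ... ORDER BY ... FOR UPDATE`. The sort happens before the locking step, so every concurrent transaction acquires the user row locks in ascending id order.
```sql
WITH forwardees AS (
    SELECT users.id AS "userId", (($1::BIGINT * pct) / 100)::BIGINT AS msats
    FROM "ItemForward"
    JOIN users ON users.id = "ItemForward"."userId"
    WHERE "ItemForward"."itemId" = $2::INTEGER
    ORDER BY users.id ASC
    FOR UPDATE OF users
)
UPDATE users
SET
    msats = users.msats + forwardees.msats,
    "stackedMsats" = users."stackedMsats" + forwardees.msats
FROM forwardees
WHERE users.id = forwardees."userId";
```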
### More resources
- https://www.postgresql.org/docs/current/explicit-locking.html#LOCKING-DEADLOCKS


@ -1,82 +0,0 @@
import { PAID_ACTION_PAYMENT_METHODS } from '@/lib/constants'
import { msatsToSats, satsToMsats } from '@/lib/format'
export const anonable = false
export const paymentMethods = [
PAID_ACTION_PAYMENT_METHODS.FEE_CREDIT,
PAID_ACTION_PAYMENT_METHODS.REWARD_SATS,
PAID_ACTION_PAYMENT_METHODS.OPTIMISTIC
]
export async function getCost ({ sats }) {
return satsToMsats(sats)
}
export async function perform ({ invoiceId, sats, id: itemId, ...args }, { me, cost, tx }) {
itemId = parseInt(itemId)
let invoiceData = {}
if (invoiceId) {
invoiceData = { invoiceId, invoiceActionState: 'PENDING' }
// store a reference to the item in the invoice
await tx.invoice.update({
where: { id: invoiceId },
data: { actionId: itemId }
})
}
const act = await tx.itemAct.create({ data: { msats: cost, itemId, userId: me.id, act: 'BOOST', ...invoiceData } })
const [{ path }] = await tx.$queryRaw`
SELECT ltree2text(path) as path FROM "Item" WHERE id = ${itemId}::INTEGER`
return { id: itemId, sats, act: 'BOOST', path, actId: act.id }
}
export async function retry ({ invoiceId, newInvoiceId }, { tx, cost }) {
await tx.itemAct.updateMany({ where: { invoiceId }, data: { invoiceId: newInvoiceId, invoiceActionState: 'PENDING' } })
const [{ id, path }] = await tx.$queryRaw`
SELECT "Item".id, ltree2text(path) as path
FROM "Item"
JOIN "ItemAct" ON "Item".id = "ItemAct"."itemId"
WHERE "ItemAct"."invoiceId" = ${newInvoiceId}::INTEGER`
return { id, sats: msatsToSats(cost), act: 'BOOST', path }
}
export async function onPaid ({ invoice, actId }, { tx }) {
let itemAct
if (invoice) {
await tx.itemAct.updateMany({
where: { invoiceId: invoice.id },
data: {
invoiceActionState: 'PAID'
}
})
itemAct = await tx.itemAct.findFirst({ where: { invoiceId: invoice.id } })
} else if (actId) {
itemAct = await tx.itemAct.findFirst({ where: { id: actId } })
} else {
throw new Error('No invoice or actId')
}
// increment boost on item
await tx.item.update({
where: { id: itemAct.itemId },
data: {
boost: { increment: msatsToSats(itemAct.msats) }
}
})
await tx.$executeRaw`
INSERT INTO pgboss.job (name, data, retrylimit, retrybackoff, startafter, keepuntil)
VALUES ('expireBoost', jsonb_build_object('id', ${itemAct.itemId}::INTEGER), 21, true,
now() + interval '30 days', now() + interval '40 days')`
}
export async function onFail ({ invoice }, { tx }) {
await tx.itemAct.updateMany({ where: { invoiceId: invoice.id }, data: { invoiceActionState: 'FAILED' } })
}
export async function describe ({ id: itemId, sats }, { actionId, cost }) {
return `SN: boost ${sats ?? msatsToSats(cost)} sats to #${itemId ?? actionId}`
}


@ -1,32 +1,26 @@
import { PAID_ACTION_PAYMENT_METHODS } from '@/lib/constants' // XXX we don't use this yet ...
// it's just showing that even buying credits
// can eventually be a paid action
import { USER_ID } from '@/lib/constants'
import { satsToMsats } from '@/lib/format' import { satsToMsats } from '@/lib/format'
export const anonable = false export const anonable = false
export const supportsPessimism = false
export const supportsOptimism = true
export const paymentMethods = [ export async function getCost ({ amount }) {
PAID_ACTION_PAYMENT_METHODS.REWARD_SATS, return satsToMsats(amount)
PAID_ACTION_PAYMENT_METHODS.PESSIMISTIC
]
export async function getCost ({ credits }) {
return satsToMsats(credits)
} }
export async function perform ({ credits }, { me, cost, tx }) { export async function onPaid ({ invoice }, { tx }) {
await tx.user.update({ return await tx.users.update({
where: { id: me.id }, where: { id: invoice.userId },
data: { data: { balance: { increment: invoice.msatsReceived } }
mcredits: {
increment: cost
}
}
}) })
return {
credits
}
} }
export async function describe () { export async function describe ({ amount }, { models, me }) {
return 'SN: buy fee credits' const user = await models.user.findUnique({ where: { id: me?.id ?? USER_ID.anon } })
return `SN: buying credits for @${user.name}`
} }


@ -1,13 +1,9 @@
import { PAID_ACTION_PAYMENT_METHODS, USER_ID } from '@/lib/constants' import { USER_ID } from '@/lib/constants'
import { satsToMsats } from '@/lib/format' import { satsToMsats } from '@/lib/format'
export const anonable = true export const anonable = true
export const supportsPessimism = true
export const paymentMethods = [ export const supportsOptimism = false
PAID_ACTION_PAYMENT_METHODS.FEE_CREDIT,
PAID_ACTION_PAYMENT_METHODS.REWARD_SATS,
PAID_ACTION_PAYMENT_METHODS.PESSIMISTIC
]
export async function getCost ({ sats }) { export async function getCost ({ sats }) {
return satsToMsats(sats) return satsToMsats(sats)


@ -1,14 +1,8 @@
import { PAID_ACTION_PAYMENT_METHODS } from '@/lib/constants'
import { msatsToSats, satsToMsats } from '@/lib/format' import { msatsToSats, satsToMsats } from '@/lib/format'
import { Prisma } from '@prisma/client'
export const anonable = false export const anonable = false
export const supportsPessimism = false
export const paymentMethods = [ export const supportsOptimism = true
PAID_ACTION_PAYMENT_METHODS.FEE_CREDIT,
PAID_ACTION_PAYMENT_METHODS.REWARD_SATS,
PAID_ACTION_PAYMENT_METHODS.OPTIMISTIC
]
export async function getCost ({ sats }) { export async function getCost ({ sats }) {
return satsToMsats(sats) return satsToMsats(sats)
@ -49,9 +43,9 @@ export async function onPaid ({ invoice, actId }, { tx }) {
let itemAct let itemAct
if (invoice) { if (invoice) {
await tx.itemAct.updateMany({ where: { invoiceId: invoice.id }, data: { invoiceActionState: 'PAID' } }) await tx.itemAct.updateMany({ where: { invoiceId: invoice.id }, data: { invoiceActionState: 'PAID' } })
itemAct = await tx.itemAct.findFirst({ where: { invoiceId: invoice.id }, include: { item: true } }) itemAct = await tx.itemAct.findFirst({ where: { invoiceId: invoice.id } })
} else if (actId) { } else if (actId) {
itemAct = await tx.itemAct.findUnique({ where: { id: actId }, include: { item: true } }) itemAct = await tx.itemAct.findUnique({ where: { id: actId } })
} else { } else {
throw new Error('No invoice or actId') throw new Error('No invoice or actId')
} }
@ -61,40 +55,25 @@ export async function onPaid ({ invoice, actId }, { tx }) {
// denormalize downzaps // denormalize downzaps
await tx.$executeRaw` await tx.$executeRaw`
WITH territory AS ( WITH zapper AS (
SELECT COALESCE(r."subName", i."subName", 'meta')::TEXT as "subName" SELECT trust FROM users WHERE id = ${itemAct.userId}::INTEGER
FROM "Item" i ), zap AS (
LEFT JOIN "Item" r ON r.id = i."rootId" INSERT INTO "ItemUserAgg" ("userId", "itemId", "downZapSats")
WHERE i.id = ${itemAct.itemId}::INTEGER VALUES (${itemAct.userId}::INTEGER, ${itemAct.itemId}::INTEGER, ${sats}::INTEGER)
), zapper AS ( ON CONFLICT ("itemId", "userId") DO UPDATE
SELECT SET "downZapSats" = "ItemUserAgg"."downZapSats" + ${sats}::INTEGER, updated_at = now()
COALESCE(${itemAct.item.parentId RETURNING LOG("downZapSats" / GREATEST("downZapSats" - ${sats}::INTEGER, 1)::FLOAT) AS log_sats
? Prisma.sql`"zapCommentTrust"` )
: Prisma.sql`"zapPostTrust"`}, 0) as "zapTrust", UPDATE "Item"
COALESCE(${itemAct.item.parentId SET "weightedDownVotes" = "weightedDownVotes" + (zapper.trust * zap.log_sats)
? Prisma.sql`"subZapCommentTrust"` FROM zap, zapper
: Prisma.sql`"subZapPostTrust"`}, 0) as "subZapTrust" WHERE "Item".id = ${itemAct.itemId}::INTEGER`
FROM territory
LEFT JOIN "UserSubTrust" ust ON ust."subName" = territory."subName"
AND ust."userId" = ${itemAct.userId}::INTEGER
), zap AS (
INSERT INTO "ItemUserAgg" ("userId", "itemId", "downZapSats")
VALUES (${itemAct.userId}::INTEGER, ${itemAct.itemId}::INTEGER, ${sats}::INTEGER)
ON CONFLICT ("itemId", "userId") DO UPDATE
SET "downZapSats" = "ItemUserAgg"."downZapSats" + ${sats}::INTEGER, updated_at = now()
RETURNING LOG("downZapSats" / GREATEST("downZapSats" - ${sats}::INTEGER, 1)::FLOAT) AS log_sats
)
UPDATE "Item"
SET "weightedDownVotes" = "weightedDownVotes" + zapper."zapTrust" * zap.log_sats,
"subWeightedDownVotes" = "subWeightedDownVotes" + zapper."subZapTrust" * zap.log_sats
FROM zap, zapper
WHERE "Item".id = ${itemAct.itemId}::INTEGER`
} }
export async function onFail ({ invoice }, { tx }) { export async function onFail ({ invoice }, { tx }) {
await tx.itemAct.updateMany({ where: { invoiceId: invoice.id }, data: { invoiceActionState: 'FAILED' } }) await tx.itemAct.updateMany({ where: { invoiceId: invoice.id }, data: { invoiceActionState: 'FAILED' } })
} }
export async function describe ({ id: itemId, sats }, { cost, actionId }) { export async function describe ({ itemId, sats }, { cost, actionId }) {
return `SN: downzap of ${sats ?? msatsToSats(cost)} sats to #${itemId ?? actionId}` return `SN: downzap of ${sats ?? msatsToSats(cost)} sats to #${itemId ?? actionId}`
} }


@ -1,11 +1,8 @@
import { createHodlInvoice, createInvoice, parsePaymentRequest } from 'ln-service' import { createHodlInvoice, createInvoice } from 'ln-service'
import { datePivot } from '@/lib/time' import { datePivot } from '@/lib/time'
import { PAID_ACTION_PAYMENT_METHODS, USER_ID } from '@/lib/constants' import { USER_ID } from '@/lib/constants'
import { createHmac } from '@/api/resolvers/wallet' import { createHmac } from '../resolvers/wallet'
import { Prisma } from '@prisma/client' import { Prisma } from '@prisma/client'
import { createWrappedInvoice, createUserInvoice } from '@/wallets/server'
import { assertBelowMaxPendingInvoices, assertBelowMaxPendingDirectPayments } from './lib/assert'
import * as ITEM_CREATE from './itemCreate' import * as ITEM_CREATE from './itemCreate'
import * as ITEM_UPDATE from './itemUpdate' import * as ITEM_UPDATE from './itemUpdate'
import * as ZAP from './zap' import * as ZAP from './zap'
@ -16,31 +13,23 @@ import * as TERRITORY_UPDATE from './territoryUpdate'
import * as TERRITORY_BILLING from './territoryBilling' import * as TERRITORY_BILLING from './territoryBilling'
import * as TERRITORY_UNARCHIVE from './territoryUnarchive' import * as TERRITORY_UNARCHIVE from './territoryUnarchive'
import * as DONATE from './donate' import * as DONATE from './donate'
import * as BOOST from './boost'
import * as RECEIVE from './receive'
import * as BUY_CREDITS from './buyCredits'
import * as INVITE_GIFT from './inviteGift'
export const paidActions = { export const paidActions = {
ITEM_CREATE, ITEM_CREATE,
ITEM_UPDATE, ITEM_UPDATE,
ZAP, ZAP,
DOWN_ZAP, DOWN_ZAP,
BOOST,
POLL_VOTE, POLL_VOTE,
TERRITORY_CREATE, TERRITORY_CREATE,
TERRITORY_UPDATE, TERRITORY_UPDATE,
TERRITORY_BILLING, TERRITORY_BILLING,
TERRITORY_UNARCHIVE, TERRITORY_UNARCHIVE,
DONATE, DONATE
RECEIVE,
BUY_CREDITS,
INVITE_GIFT
} }
export default async function performPaidAction (actionType, args, incomingContext) { export default async function performPaidAction (actionType, args, context) {
try { try {
const { me, models, forcePaymentMethod } = incomingContext const { me, models, forceFeeCredits } = context
const paidAction = paidActions[actionType] const paidAction = paidActions[actionType]
console.group('performPaidAction', actionType, args) console.group('performPaidAction', actionType, args)
@ -49,85 +38,49 @@ export default async function performPaidAction (actionType, args, incomingConte
throw new Error(`Invalid action type ${actionType}`) throw new Error(`Invalid action type ${actionType}`)
} }
if (!me && !paidAction.anonable) { context.me = me ? await models.user.findUnique({ where: { id: me.id } }) : undefined
throw new Error('You must be logged in to perform this action') context.cost = await paidAction.getCost(args, context)
}
// treat context as immutable if (!me) {
const contextWithMe = { if (!paidAction.anonable) {
...incomingContext, throw new Error('You must be logged in to perform this action')
me: me ? await models.user.findUnique({ where: { id: parseInt(me.id) } }) : undefined
}
const context = {
...contextWithMe,
cost: await paidAction.getCost(args, contextWithMe),
sybilFeePercent: await paidAction.getSybilFeePercent?.(args, contextWithMe)
}
// special case for zero cost actions
if (context.cost === 0n) {
console.log('performing zero cost action')
return await performNoInvoiceAction(actionType, args, { ...context, paymentMethod: 'ZERO_COST' })
}
for (const paymentMethod of paidAction.paymentMethods) {
console.log(`considering payment method ${paymentMethod}`)
const contextWithPaymentMethod = { ...context, paymentMethod }
if (forcePaymentMethod &&
paymentMethod !== forcePaymentMethod) {
console.log('skipping payment method', paymentMethod, 'because forcePaymentMethod is set to', forcePaymentMethod)
continue
} }
// payment methods that anonymous users can use console.log('we are anon so can only perform pessimistic action')
if (paymentMethod === PAID_ACTION_PAYMENT_METHODS.P2P) { return await performPessimisticAction(actionType, args, context)
try { }
return await performP2PAction(actionType, args, contextWithPaymentMethod)
} catch (e) { const isRich = context.cost <= context.me.msats
if (e instanceof NonInvoiceablePeerError) { if (isRich) {
console.log('peer cannot be invoiced, skipping') try {
continue console.log('enough fee credits available, performing fee credit action')
} return await performFeeCreditAction(actionType, args, context)
console.error(`${paymentMethod} action failed`, e) } catch (e) {
console.error('fee credit action failed', e)
// if we fail with fee credits, but not because of insufficient funds, bail
if (!e.message.includes('\\"users\\" violates check constraint \\"msats_positive\\"')) {
throw e throw e
} }
} else if (paymentMethod === PAID_ACTION_PAYMENT_METHODS.PESSIMISTIC) {
return await beginPessimisticAction(actionType, args, contextWithPaymentMethod)
}
// additional payment methods that logged in users can use
if (me) {
if (paymentMethod === PAID_ACTION_PAYMENT_METHODS.FEE_CREDIT ||
paymentMethod === PAID_ACTION_PAYMENT_METHODS.REWARD_SATS) {
try {
return await performNoInvoiceAction(actionType, args, contextWithPaymentMethod)
} catch (e) {
// if we fail with fee credits or reward sats, but not because of insufficient funds, bail
console.error(`${paymentMethod} action failed`, e)
if (!e.message.includes('\\"users\\" violates check constraint \\"msats_positive\\"') &&
!e.message.includes('\\"users\\" violates check constraint \\"mcredits_positive\\"')) {
throw e
}
}
} else if (paymentMethod === PAID_ACTION_PAYMENT_METHODS.OPTIMISTIC) {
return await performOptimisticAction(actionType, args, contextWithPaymentMethod)
} else if (paymentMethod === PAID_ACTION_PAYMENT_METHODS.DIRECT) {
try {
return await performDirectAction(actionType, args, contextWithPaymentMethod)
} catch (e) {
if (e instanceof NonInvoiceablePeerError) {
console.log('peer cannot be invoiced, skipping')
continue
}
console.error(`${paymentMethod} action failed`, e)
throw e
}
}
} }
} }
throw new Error('No working payment method found') // this is set if the worker executes a paid action in behalf of a user.
// in that case, only payment via fee credits is possible
// since there is no client to which we could send an invoice.
// example: automated territory billing
if (forceFeeCredits) {
throw new Error('forceFeeCredits is set, but user does not have enough fee credits')
}
// if we fail to do the action with fee credits, we should fall back to optimistic
if (paidAction.supportsOptimism) {
console.log('performing optimistic action')
return await performOptimisticAction(actionType, args, context)
}
console.error('action does not support optimism and fee credits failed, performing pessimistic action')
return await performPessimisticAction(actionType, args, context)
} catch (e) { } catch (e) {
console.error('performPaidAction failed', e) console.error('performPaidAction failed', e)
throw e throw e
@ -136,53 +89,43 @@ export default async function performPaidAction (actionType, args, incomingConte
} }
} }
async function performNoInvoiceAction (actionType, args, incomingContext) { async function performFeeCreditAction (actionType, args, context) {
const { me, models, cost, paymentMethod } = incomingContext const { me, models, cost } = context
const action = paidActions[actionType] const action = paidActions[actionType]
const result = await models.$transaction(async tx => { return await models.$transaction(async tx => {
const context = { ...incomingContext, tx } context.tx = tx
if (paymentMethod === 'FEE_CREDIT') { await tx.user.update({
await tx.user.update({ where: {
where: { id: me.id
id: me?.id ?? USER_ID.anon },
}, data: {
data: { mcredits: { decrement: cost } } msats: {
}) decrement: cost
} else if (paymentMethod === PAID_ACTION_PAYMENT_METHODS.REWARD_SATS) { }
await tx.user.update({ }
where: { })
id: me?.id ?? USER_ID.anon
},
data: { msats: { decrement: cost } }
})
}
const result = await action.perform(args, context) const result = await action.perform(args, context)
await action.onPaid?.(result, context) await action.onPaid?.(result, context)
return { return {
result, result,
paymentMethod paymentMethod: 'FEE_CREDIT'
} }
}, { isolationLevel: Prisma.TransactionIsolationLevel.ReadCommitted }) }, { isolationLevel: Prisma.TransactionIsolationLevel.ReadCommitted })
// run non critical side effects in the background
// after the transaction has been committed
action.nonCriticalSideEffects?.(result.result, incomingContext).catch(console.error)
return result
} }
async function performOptimisticAction (actionType, args, incomingContext) { async function performOptimisticAction (actionType, args, context) {
const { models, invoiceArgs: incomingInvoiceArgs } = incomingContext const { models } = context
const action = paidActions[actionType] const action = paidActions[actionType]
const optimisticContext = { ...incomingContext, optimistic: true } context.optimistic = true
const invoiceArgs = incomingInvoiceArgs ?? await createSNInvoice(actionType, args, optimisticContext) context.lndInvoice = await createLndInvoice(actionType, args, context)
return await models.$transaction(async tx => { return await models.$transaction(async tx => {
const context = { ...optimisticContext, tx, invoiceArgs } context.tx = tx
const invoice = await createDbInvoice(actionType, args, context) const invoice = await createDbInvoice(actionType, args, context)
@ -194,128 +137,24 @@ async function performOptimisticAction (actionType, args, incomingContext) {
}, { isolationLevel: Prisma.TransactionIsolationLevel.ReadCommitted }) }, { isolationLevel: Prisma.TransactionIsolationLevel.ReadCommitted })
} }
async function beginPessimisticAction (actionType, args, context) { async function performPessimisticAction (actionType, args, context) {
const action = paidActions[actionType] const action = paidActions[actionType]
if (!action.paymentMethods.includes(PAID_ACTION_PAYMENT_METHODS.PESSIMISTIC)) { if (!action.supportsPessimism) {
throw new Error(`This action ${actionType} does not support pessimistic invoicing`) throw new Error(`This action ${actionType} does not support pessimistic invoicing`)
} }
// just create the invoice and complete action when it's paid // just create the invoice and complete action when it's paid
const invoiceArgs = context.invoiceArgs ?? await createSNInvoice(actionType, args, context) context.lndInvoice = await createLndInvoice(actionType, args, context)
return { return {
invoice: await createDbInvoice(actionType, args, { ...context, invoiceArgs }), invoice: await createDbInvoice(actionType, args, context),
paymentMethod: 'PESSIMISTIC' paymentMethod: 'PESSIMISTIC'
} }
} }
async function performP2PAction (actionType, args, incomingContext) { export async function retryPaidAction (actionType, args, context) {
// if the action has an invoiceable peer, we'll create a peer invoice const { models, me } = context
// wrap it, and return the wrapped invoice const { invoiceId } = args
const { cost, sybilFeePercent, models, lnd, me } = incomingContext
if (!sybilFeePercent) {
throw new Error('sybil fee percent is not set for an invoiceable peer action')
}
const userId = await paidActions[actionType]?.getInvoiceablePeer?.(args, incomingContext)
if (!userId) {
throw new NonInvoiceablePeerError()
}
let context
try {
await assertBelowMaxPendingInvoices(incomingContext)
const description = await paidActions[actionType].describe(args, incomingContext)
const { invoice, wrappedInvoice, wallet, maxFee } = await createWrappedInvoice(userId, {
msats: cost,
feePercent: sybilFeePercent,
description,
expiry: INVOICE_EXPIRE_SECS
}, { models, me, lnd })
context = {
...incomingContext,
invoiceArgs: {
bolt11: invoice,
wrappedBolt11: wrappedInvoice,
wallet,
maxFee
}
}
} catch (e) {
console.error('failed to create wrapped invoice', e)
throw new NonInvoiceablePeerError()
}
return me
? await performOptimisticAction(actionType, args, context)
: await beginPessimisticAction(actionType, args, context)
}
// we don't need to use the module for perform-ing outside actions
// because we can't track the state of outside invoices we aren't paid/paying
async function performDirectAction (actionType, args, incomingContext) {
const { models, lnd, cost } = incomingContext
const { comment, lud18Data, noteStr, description: actionDescription } = args
const userId = await paidActions[actionType]?.getInvoiceablePeer?.(args, incomingContext)
if (!userId) {
throw new NonInvoiceablePeerError()
}
try {
await assertBelowMaxPendingDirectPayments(userId, incomingContext)
const description = actionDescription ?? await paidActions[actionType].describe(args, incomingContext)
for await (const { invoice, logger, wallet } of createUserInvoice(userId, {
msats: cost,
description,
expiry: INVOICE_EXPIRE_SECS
}, { models, lnd })) {
let hash
try {
hash = parsePaymentRequest({ request: invoice }).id
} catch (e) {
console.error('failed to parse invoice', e)
logger?.error('failed to parse invoice: ' + e.message, { bolt11: invoice })
continue
}
try {
return {
invoice: await models.directPayment.create({
data: {
comment,
lud18Data,
desc: noteStr,
bolt11: invoice,
msats: cost,
hash,
walletId: wallet.id,
receiverId: userId
}
}),
paymentMethod: 'DIRECT'
}
} catch (e) {
console.error('failed to create direct payment', e)
logger?.error('failed to create direct payment: ' + e.message, { bolt11: invoice })
}
}
} catch (e) {
console.error('failed to create user invoice', e)
}
throw new NonInvoiceablePeerError()
}
export async function retryPaidAction (actionType, args, incomingContext) {
const { models, me } = incomingContext
const { invoice: failedInvoice } = args
console.log('retryPaidAction', actionType, args)
const action = paidActions[actionType] const action = paidActions[actionType]
if (!action) { if (!action) {
@ -326,56 +165,32 @@ export async function retryPaidAction (actionType, args, incomingContext) {
throw new Error(`retryPaidAction - must be logged in ${actionType}`) throw new Error(`retryPaidAction - must be logged in ${actionType}`)
} }
if (!failedInvoice) { if (!action.supportsOptimism) {
throw new Error(`retryPaidAction - missing invoice ${actionType}`) throw new Error(`retryPaidAction - action does not support optimism ${actionType}`)
} }
const { msatsRequested, actionId, actionArgs, actionOptimistic } = failedInvoice if (!action.retry) {
const retryContext = { throw new Error(`retryPaidAction - action does not support retrying ${actionType}`)
...incomingContext,
optimistic: actionOptimistic,
me: await models.user.findUnique({ where: { id: parseInt(me.id) } }),
cost: BigInt(msatsRequested),
actionId,
predecessorId: failedInvoice.id
} }
let invoiceArgs if (!invoiceId) {
const invoiceForward = await models.invoiceForward.findUnique({ throw new Error(`retryPaidAction - missing invoiceId ${actionType}`)
where: {
invoiceId: failedInvoice.id
},
include: {
wallet: true
}
})
if (invoiceForward) {
// this is a wrapped invoice, we need to retry it with receiver fallbacks
try {
const { userId } = invoiceForward.wallet
// this will return an invoice from the first receiver wallet that didn't fail yet and throw if none is available
const { invoice: bolt11, wrappedInvoice: wrappedBolt11, wallet, maxFee } = await createWrappedInvoice(userId, {
msats: failedInvoice.msatsRequested,
feePercent: await action.getSybilFeePercent?.(actionArgs, retryContext),
description: await action.describe?.(actionArgs, retryContext),
expiry: INVOICE_EXPIRE_SECS
}, retryContext)
invoiceArgs = { bolt11, wrappedBolt11, wallet, maxFee }
} catch (err) {
console.log('failed to retry wrapped invoice, falling back to SN:', err)
}
} }
invoiceArgs ??= await createSNInvoice(actionType, actionArgs, retryContext) context.optimistic = true
context.me = await models.user.findUnique({ where: { id: me.id } })
const { msatsRequested } = await models.invoice.findUnique({ where: { id: invoiceId, actionState: 'FAILED' } })
context.cost = BigInt(msatsRequested)
context.lndInvoice = await createLndInvoice(actionType, args, context)
return await models.$transaction(async tx => { return await models.$transaction(async tx => {
const context = { ...retryContext, tx, invoiceArgs } context.tx = tx
// update the old invoice to RETRYING, so that it's not confused with FAILED // update the old invoice to RETRYING, so that it's not confused with FAILED
await tx.invoice.update({ const { actionId } = await tx.invoice.update({
where: { where: {
id: failedInvoice.id, id: invoiceId,
actionState: 'FAILED' actionState: 'FAILED'
}, },
data: { data: {
@ -383,109 +198,80 @@ export async function retryPaidAction (actionType, args, incomingContext) {
} }
}) })
context.actionId = actionId
// create a new invoice // create a new invoice
const invoice = await createDbInvoice(actionType, actionArgs, context) const invoice = await createDbInvoice(actionType, args, context)
return { return {
result: await action.retry?.({ invoiceId: failedInvoice.id, newInvoiceId: invoice.id }, context), result: await action.retry({ invoiceId, newInvoiceId: invoice.id }, context),
invoice, invoice,
paymentMethod: actionOptimistic ? 'OPTIMISTIC' : 'PESSIMISTIC' paymentMethod: 'OPTIMISTIC'
} }
}, { isolationLevel: Prisma.TransactionIsolationLevel.ReadCommitted }) }, { isolationLevel: Prisma.TransactionIsolationLevel.ReadCommitted })
} }
const INVOICE_EXPIRE_SECS = 600 const OPTIMISTIC_INVOICE_EXPIRE = { minutes: 10 }
const PESSIMISTIC_INVOICE_EXPIRE = { minutes: 10 }
export class NonInvoiceablePeerError extends Error {
constructor () {
super('non invoiceable peer')
this.name = 'NonInvoiceablePeerError'
}
}
// we seperate the invoice creation into two functions because // we seperate the invoice creation into two functions because
// because if lnd is slow, it'll timeout the interactive tx // because if lnd is slow, it'll timeout the interactive tx
async function createSNInvoice (actionType, args, context) { async function createLndInvoice (actionType, args, context) {
const { me, lnd, cost, optimistic } = context const { me, lnd, cost, optimistic } = context
const action = paidActions[actionType] const action = paidActions[actionType]
const createLNDInvoice = optimistic ? createInvoice : createHodlInvoice const [createLNDInvoice, expirePivot] = optimistic
? [createInvoice, OPTIMISTIC_INVOICE_EXPIRE]
await assertBelowMaxPendingInvoices(context) : [createHodlInvoice, PESSIMISTIC_INVOICE_EXPIRE]
if (cost < 1000n) { if (cost < 1000n) {
// sanity check // sanity check
throw new Error('The cost of the action must be at least 1 sat') throw new Error('The cost of the action must be at least 1 sat')
} }
const expiresAt = datePivot(new Date(), { seconds: INVOICE_EXPIRE_SECS }) const expiresAt = datePivot(new Date(), expirePivot)
const invoice = await createLNDInvoice({ return await createLNDInvoice({
description: me?.hideInvoiceDesc ? undefined : await action.describe(args, context), description: me?.hideInvoiceDesc ? undefined : await action.describe(args, context),
lnd, lnd,
mtokens: String(cost), mtokens: String(cost),
expires_at: expiresAt expires_at: expiresAt
}) })
return { bolt11: invoice.request, preimage: invoice.secret }
} }
async function createDbInvoice (actionType, args, context) { async function createDbInvoice (actionType, args, context) {
const { me, models, tx, cost, optimistic, actionId, invoiceArgs, paymentAttempt, predecessorId } = context const { me, models, tx, lndInvoice, cost, optimistic, actionId } = context
const { bolt11, wrappedBolt11, preimage, wallet, maxFee } = invoiceArgs
const db = tx ?? models const db = tx ?? models
const [expirePivot, actionState] = optimistic
? [OPTIMISTIC_INVOICE_EXPIRE, 'PENDING']
: [PESSIMISTIC_INVOICE_EXPIRE, 'PENDING_HELD']
if (cost < 1000n) { if (cost < 1000n) {
// sanity check // sanity check
throw new Error('The cost of the action must be at least 1 sat') throw new Error('The cost of the action must be at least 1 sat')
} }
const servedBolt11 = wrappedBolt11 ?? bolt11 const expiresAt = datePivot(new Date(), expirePivot)
const servedInvoice = parsePaymentRequest({ request: servedBolt11 }) const invoice = await db.invoice.create({
const expiresAt = new Date(servedInvoice.expires_at) data: {
hash: lndInvoice.id,
const invoiceData = { msatsRequested: cost,
hash: servedInvoice.id, preimage: optimistic ? undefined : lndInvoice.secret,
msatsRequested: BigInt(servedInvoice.mtokens), bolt11: lndInvoice.request,
preimage, userId: me?.id ?? USER_ID.anon,
bolt11: servedBolt11, actionType,
userId: me?.id ?? USER_ID.anon, actionState,
actionType, actionArgs: args,
actionState: wrappedBolt11 ? 'PENDING_HELD' : optimistic ? 'PENDING' : 'PENDING_HELD', expiresAt,
actionOptimistic: optimistic, actionId
actionArgs: args, }
expiresAt, })
actionId,
paymentAttempt,
predecessorId
}
let invoice
if (wrappedBolt11) {
invoice = (await db.invoiceForward.create({
include: { invoice: true },
data: {
bolt11,
maxFeeMsats: maxFee,
invoice: {
create: invoiceData
},
wallet: {
connect: {
id: wallet.id
}
}
}
})).invoice
} else {
invoice = await db.invoice.create({ data: invoiceData })
}
// insert a job to check the invoice after it's set to expire // insert a job to check the invoice after it's set to expire
await db.$executeRaw` await db.$executeRaw`
INSERT INTO pgboss.job (name, data, retrylimit, retrybackoff, startafter, keepuntil, priority) INSERT INTO pgboss.job (name, data, retrylimit, retrybackoff, startafter, expirein, priority)
VALUES ('checkInvoice', VALUES ('checkInvoice',
jsonb_build_object('hash', ${invoice.hash}::TEXT), 21, true, jsonb_build_object('hash', ${lndInvoice.id}::TEXT), 21, true,
${expiresAt}::TIMESTAMP WITH TIME ZONE, ${expiresAt}::TIMESTAMP WITH TIME ZONE,
${expiresAt}::TIMESTAMP WITH TIME ZONE + interval '10m', 100)` ${expiresAt}::TIMESTAMP WITH TIME ZONE - now() + interval '10m', 100)`
// the HMAC is only returned during invoice creation // the HMAC is only returned during invoice creation
// this makes sure that only the person who created this invoice // this makes sure that only the person who created this invoice


@ -1,60 +0,0 @@
import { PAID_ACTION_PAYMENT_METHODS } from '@/lib/constants'
import { satsToMsats } from '@/lib/format'
import { notifyInvite } from '@/lib/webPush'
export const anonable = false
export const paymentMethods = [
PAID_ACTION_PAYMENT_METHODS.FEE_CREDIT,
PAID_ACTION_PAYMENT_METHODS.REWARD_SATS
]
export async function getCost ({ id }, { models, me }) {
const invite = await models.invite.findUnique({ where: { id, userId: me.id, revoked: false } })
if (!invite) {
throw new Error('invite not found')
}
return satsToMsats(invite.gift)
}
export async function perform ({ id, userId }, { me, cost, tx }) {
const invite = await tx.invite.findUnique({
where: { id, userId: me.id, revoked: false }
})
if (invite.limit && invite.giftedCount >= invite.limit) {
throw new Error('invite limit reached')
}
// check that user was created in last hour
// check that user did not already redeem an invite
await tx.user.update({
where: {
id: userId,
inviteId: null,
createdAt: {
gt: new Date(Date.now() - 1000 * 60 * 60)
}
},
data: {
mcredits: {
increment: cost
},
inviteId: id,
referrerId: me.id
}
})
return await tx.invite.update({
where: { id, userId: me.id, revoked: false, ...(invite.limit ? { giftedCount: { lt: invite.limit } } : {}) },
data: {
giftedCount: {
increment: 1
}
}
})
}
export async function nonCriticalSideEffects (_, { me }) {
notifyInvite(me.id)
}


@ -1,57 +1,28 @@
import { ANON_ITEM_SPAM_INTERVAL, ITEM_SPAM_INTERVAL, PAID_ACTION_PAYMENT_METHODS, USER_ID } from '@/lib/constants' import { ANON_ITEM_SPAM_INTERVAL, ITEM_SPAM_INTERVAL, USER_ID } from '@/lib/constants'
import { notifyItemMention, notifyItemParents, notifyMention, notifyTerritorySubscribers, notifyUserSubscribers, notifyThreadSubscribers } from '@/lib/webPush' import { notifyItemMention, notifyItemParents, notifyMention, notifyTerritorySubscribers, notifyUserSubscribers } from '@/lib/webPush'
import { getItemMentions, getMentions, performBotBehavior } from './lib/item' import { getItemMentions, getMentions, performBotBehavior } from './lib/item'
import { msatsToSats, satsToMsats } from '@/lib/format' import { satsToMsats } from '@/lib/format'
import { GqlInputError } from '@/lib/error'
export const anonable = true export const anonable = true
export const supportsPessimism = true
export const paymentMethods = [ export const supportsOptimism = true
PAID_ACTION_PAYMENT_METHODS.FEE_CREDIT,
PAID_ACTION_PAYMENT_METHODS.REWARD_SATS,
PAID_ACTION_PAYMENT_METHODS.OPTIMISTIC,
PAID_ACTION_PAYMENT_METHODS.PESSIMISTIC
]
export const DEFAULT_ITEM_COST = 1000n
export async function getBaseCost ({ models, bio, parentId, subName }) {
if (bio) return DEFAULT_ITEM_COST
if (parentId) {
// the subname is stored in the root item of the thread
const [sub] = await models.$queryRaw`
SELECT s."replyCost"
FROM "Item" i
LEFT JOIN "Item" r ON r.id = i."rootId"
LEFT JOIN "Sub" s ON s.name = COALESCE(r."subName", i."subName")
WHERE i.id = ${Number(parentId)}`
if (sub?.replyCost) return satsToMsats(sub.replyCost)
return DEFAULT_ITEM_COST
}
const sub = await models.sub.findUnique({ where: { name: subName } })
return satsToMsats(sub.baseCost)
}
export async function getCost ({ subName, parentId, uploadIds, boost = 0, bio }, { models, me }) { export async function getCost ({ subName, parentId, uploadIds, boost = 0, bio }, { models, me }) {
const baseCost = await getBaseCost({ models, bio, parentId, subName }) const sub = (parentId || bio) ? null : await models.sub.findUnique({ where: { name: subName } })
const baseCost = sub ? satsToMsats(sub.baseCost) : 1000n
// cost = baseCost * 10^num_items_in_10m * 100 (anon) or 1 (user) + upload fees + boost // cost = baseCost * 10^num_items_in_10m * 100 (anon) or 1 (user) + image fees + boost
const [{ cost }] = await models.$queryRaw` const [{ cost }] = await models.$queryRaw`
SELECT ${baseCost}::INTEGER SELECT ${baseCost}::INTEGER
* POWER(10, item_spam(${parseInt(parentId)}::INTEGER, ${me?.id ?? USER_ID.anon}::INTEGER, * POWER(10, item_spam(${parseInt(parentId)}::INTEGER, ${me?.id ?? USER_ID.anon}::INTEGER,
${me?.id && !bio ? ITEM_SPAM_INTERVAL : ANON_ITEM_SPAM_INTERVAL}::INTERVAL)) ${me?.id && !bio ? ITEM_SPAM_INTERVAL : ANON_ITEM_SPAM_INTERVAL}::INTERVAL))
* ${me ? 1 : 100}::INTEGER * ${me ? 1 : 100}::INTEGER
+ (SELECT "nUnpaid" * "uploadFeesMsats" + (SELECT "nUnpaid" * "imageFeeMsats"
FROM upload_fees(${me?.id || USER_ID.anon}::INTEGER, ${uploadIds}::INTEGER[])) FROM image_fees_info(${me?.id || USER_ID.anon}::INTEGER, ${uploadIds}::INTEGER[]))
+ ${satsToMsats(boost)}::INTEGER as cost` + ${satsToMsats(boost)}::INTEGER as cost`
// sub allows freebies (or is a bio or a comment), cost is less than baseCost, not anon, // sub allows freebies (or is a bio or a comment), cost is less than baseCost, not anon, and cost must be greater than user's balance
// cost must be greater than user's balance, and user has not disabled freebies const freebie = (parentId || bio || sub?.allowFreebies) && cost <= baseCost && !!me && cost > me?.msats
const freebie = (parentId || bio) && cost <= baseCost && !!me &&
me?.msats < cost && !me?.disableFreebies && me?.mcredits < cost
return freebie ? BigInt(0) : BigInt(cost) return freebie ? BigInt(0) : BigInt(cost)
} }
@ -61,16 +32,6 @@ export async function perform (args, context) {
const { tx, me, cost } = context const { tx, me, cost } = context
const boostMsats = satsToMsats(boost) const boostMsats = satsToMsats(boost)
const deletedUploads = []
for (const uploadId of uploadIds) {
if (!await tx.upload.findUnique({ where: { id: uploadId } })) {
deletedUploads.push(uploadId)
}
}
if (deletedUploads.length > 0) {
throw new Error(`upload(s) ${deletedUploads.join(', ')} are expired, consider reuploading.`)
}
let invoiceData = {} let invoiceData = {}
if (invoiceId) { if (invoiceId) {
invoiceData = { invoiceId, invoiceActionState: 'PENDING' } invoiceData = { invoiceId, invoiceActionState: 'PENDING' }
@ -90,7 +51,8 @@ export async function perform (args, context) {
itemActs.push({ itemActs.push({
msats: cost - boostMsats, act: 'FEE', userId: data.userId, ...invoiceData msats: cost - boostMsats, act: 'FEE', userId: data.userId, ...invoiceData
}) })
data.cost = msatsToSats(cost - boostMsats) } else {
data.freebie = true
} }
const mentions = await getMentions(args, context) const mentions = await getMentions(args, context)
@ -160,15 +122,7 @@ export async function perform (args, context) {
} }
})).bio })).bio
} else { } else {
try { item = await tx.item.create({ data: itemData })
item = await tx.item.create({ data: itemData })
} catch (err) {
if (err.message.includes('violates exclusion constraint \\"Item_unique_time_constraint\\"')) {
const message = `you already submitted this ${itemData.title ? 'post' : 'comment'}`
throw new GqlInputError(message)
}
throw err
}
} }
// store a reference to the item in the invoice // store a reference to the item in the invoice
@ -199,13 +153,15 @@ export async function retry ({ invoiceId, newInvoiceId }, { tx }) {
} }
export async function onPaid ({ invoice, id }, context) { export async function onPaid ({ invoice, id }, context) {
const { tx } = context const { models, tx } = context
let item let item
if (invoice) { if (invoice) {
item = await tx.item.findFirst({ item = await tx.item.findFirst({
where: { invoiceId: invoice.id }, where: { invoiceId: invoice.id },
include: { include: {
mentions: true,
itemReferrers: { include: { refereeItem: true } },
user: true user: true
} }
}) })
@ -216,6 +172,8 @@ export async function onPaid ({ invoice, id }, context) {
item = await tx.item.findUnique({ item = await tx.item.findUnique({
where: { id }, where: { id },
include: { include: {
mentions: true,
itemReferrers: { include: { refereeItem: true } },
user: true, user: true,
itemUploads: { include: { upload: true } } itemUploads: { include: { upload: true } }
} }
@ -236,13 +194,6 @@ export async function onPaid ({ invoice, id }, context) {
INSERT INTO pgboss.job (name, data, retrylimit, retrybackoff, startafter) INSERT INTO pgboss.job (name, data, retrylimit, retrybackoff, startafter)
VALUES ('imgproxy', jsonb_build_object('id', ${item.id}::INTEGER), 21, true, now() + interval '5 seconds')` VALUES ('imgproxy', jsonb_build_object('id', ${item.id}::INTEGER), 21, true, now() + interval '5 seconds')`
if (item.boost > 0) {
await tx.$executeRaw`
INSERT INTO pgboss.job (name, data, retrylimit, retrybackoff, startafter, keepuntil)
VALUES ('expireBoost', jsonb_build_object('id', ${item.id}::INTEGER), 21, true,
now() + interval '30 days', now() + interval '40 days')`
}
if (item.parentId) { if (item.parentId) {
// denormalize ncomments, lastCommentAt, and "weightedComments" for ancestors, and insert into reply table // denormalize ncomments, lastCommentAt, and "weightedComments" for ancestors, and insert into reply table
await tx.$executeRaw` await tx.$executeRaw`
@ -252,48 +203,30 @@ export async function onPaid ({ invoice, id }, context) {
JOIN users ON "Item"."userId" = users.id JOIN users ON "Item"."userId" = users.id
WHERE "Item".id = ${item.id}::INTEGER WHERE "Item".id = ${item.id}::INTEGER
), ancestors AS ( ), ancestors AS (
SELECT "Item".*
FROM "Item", comment
WHERE "Item".path @> comment.path AND "Item".id <> comment.id
ORDER BY "Item".id
), updated_ancestors AS (
UPDATE "Item" UPDATE "Item"
SET ncomments = "Item".ncomments + 1, SET ncomments = "Item".ncomments + 1,
"lastCommentAt" = GREATEST("Item"."lastCommentAt", comment.created_at), "lastCommentAt" = now(),
"nDirectComments" = "Item"."nDirectComments" + "weightedComments" = "Item"."weightedComments" +
CASE WHEN comment."parentId" = "Item".id THEN 1 ELSE 0 END CASE WHEN comment."userId" = "Item"."userId" THEN 0 ELSE comment.trust END
FROM comment, ancestors FROM comment
WHERE "Item".id = ancestors.id WHERE "Item".path @> comment.path AND "Item".id <> comment.id
RETURNING "Item".* RETURNING "Item".*
) )
INSERT INTO "Reply" (created_at, updated_at, "ancestorId", "ancestorUserId", "itemId", "userId", level) INSERT INTO "Reply" (created_at, updated_at, "ancestorId", "ancestorUserId", "itemId", "userId", level)
SELECT comment.created_at, comment.updated_at, ancestors.id, ancestors."userId", SELECT comment.created_at, comment.updated_at, ancestors.id, ancestors."userId",
comment.id, comment."userId", nlevel(comment.path) - nlevel(ancestors.path) comment.id, comment."userId", nlevel(comment.path) - nlevel(ancestors.path)
FROM ancestors, comment` FROM ancestors, comment
} WHERE ancestors."userId" <> comment."userId"`
}
export async function nonCriticalSideEffects ({ invoice, id }, { models }) {
const item = await models.item.findFirst({
where: invoice ? { invoiceId: invoice.id } : { id: parseInt(id) },
include: {
mentions: true,
itemReferrers: { include: { refereeItem: true } },
user: true
}
})
if (item.parentId) {
notifyItemParents({ item, models }).catch(console.error) notifyItemParents({ item, models }).catch(console.error)
notifyThreadSubscribers({ models, item }).catch(console.error)
} }
for (const { userId } of item.mentions) { for (const { userId } of item.mentions) {
notifyMention({ models, item, userId }).catch(console.error) notifyMention({ models, item, userId }).catch(console.error)
} }
for (const { refereeItem } of item.itemReferrers) { for (const { refereeItem } of item.itemReferrers) {
notifyItemMention({ models, referrerItem: item, refereeItem }).catch(console.error) notifyItemMention({ models, referrerItem: item, refereeItem }).catch(console.error)
} }
notifyUserSubscribers({ models, item }).catch(console.error) notifyUserSubscribers({ models, item }).catch(console.error)
notifyTerritorySubscribers({ models, item }).catch(console.error) notifyTerritorySubscribers({ models, item }).catch(console.error)
} }


@ -1,34 +1,24 @@
import { PAID_ACTION_PAYMENT_METHODS, USER_ID } from '@/lib/constants' import { USER_ID } from '@/lib/constants'
import { uploadFees } from '../resolvers/upload' import { imageFeesInfo } from '../resolvers/image'
import { getItemMentions, getMentions, performBotBehavior } from './lib/item' import { getItemMentions, getMentions, performBotBehavior } from './lib/item'
import { notifyItemMention, notifyMention } from '@/lib/webPush' import { notifyItemMention, notifyMention } from '@/lib/webPush'
import { satsToMsats } from '@/lib/format' import { satsToMsats } from '@/lib/format'
export const anonable = true export const anonable = false
export const supportsPessimism = true
export const supportsOptimism = false
export const paymentMethods = [ export async function getCost ({ id, boost = 0, uploadIds }, { me, models }) {
PAID_ACTION_PAYMENT_METHODS.FEE_CREDIT,
PAID_ACTION_PAYMENT_METHODS.REWARD_SATS,
PAID_ACTION_PAYMENT_METHODS.PESSIMISTIC
]
export async function getCost ({ id, boost = 0, uploadIds, bio }, { me, models }) {
// the only reason updating items costs anything is when it has new uploads // the only reason updating items costs anything is when it has new uploads
// or more boost // or more boost
const old = await models.item.findUnique({ where: { id: parseInt(id) } }) const old = await models.item.findUnique({ where: { id: parseInt(id) } })
const { totalFeesMsats } = await uploadFees(uploadIds, { models, me }) const { totalFeesMsats } = await imageFeesInfo(uploadIds, { models, me })
const cost = BigInt(totalFeesMsats) + satsToMsats(boost - old.boost) return BigInt(totalFeesMsats) + satsToMsats(boost - (old.boost || 0))
if (cost > 0 && old.invoiceActionState && old.invoiceActionState !== 'PAID') {
throw new Error('creation invoice not paid')
}
return cost
} }
export async function perform (args, context) { export async function perform (args, context) {
const { id, boost = 0, uploadIds = [], options: pollOptions = [], forwardUsers: itemForwards = [], ...data } = args const { id, boost = 0, uploadIds = [], options: pollOptions = [], forwardUsers: itemForwards = [], invoiceId, ...data } = args
const { tx, me } = context const { tx, me, models } = context
const old = await tx.item.findUnique({ const old = await tx.item.findUnique({
where: { id: parseInt(id) }, where: { id: parseInt(id) },
include: { include: {
@ -40,10 +30,9 @@ export async function perform (args, context) {
} }
}) })
const newBoost = boost - old.boost const boostMsats = satsToMsats(boost - (old.boost || 0))
const itemActs = [] const itemActs = []
if (newBoost > 0) { if (boostMsats > 0) {
const boostMsats = satsToMsats(newBoost)
itemActs.push({ itemActs.push({
msats: boostMsats, act: 'BOOST', userId: me?.id || USER_ID.anon msats: boostMsats, act: 'BOOST', userId: me?.id || USER_ID.anon
}) })
@ -65,15 +54,15 @@ export async function perform (args, context) {
data: { paid: true } data: { paid: true }
}) })
// we put boost in the where clause because we don't want to update the boost const item = await tx.item.update({
// if it has changed concurrently where: { id: parseInt(id) },
await tx.item.update({ include: {
where: { id: parseInt(id), boost: old.boost }, mentions: true,
itemReferrers: { include: { refereeItem: true } }
},
data: { data: {
...data, ...data,
boost: { boost,
increment: newBoost
},
pollOptions: { pollOptions: {
createMany: { createMany: {
data: pollOptions?.map(option => ({ option })) data: pollOptions?.map(option => ({ option }))
@ -137,35 +126,11 @@ export async function perform (args, context) {
} }
}) })
await tx.$executeRaw` await tx.$executeRaw`INSERT INTO pgboss.job (name, data, retrylimit, retrybackoff, startafter)
INSERT INTO pgboss.job (name, data, retrylimit, retrybackoff, startafter, keepuntil) VALUES ('imgproxy', jsonb_build_object('id', ${id}::INTEGER), 21, true, now() + interval '5 seconds')`
VALUES ('imgproxy', jsonb_build_object('id', ${id}::INTEGER), 21, true,
now() + interval '5 seconds', now() + interval '1 day')`
if (newBoost > 0) {
await tx.$executeRaw`
INSERT INTO pgboss.job (name, data, retrylimit, retrybackoff, startafter, keepuntil)
VALUES ('expireBoost', jsonb_build_object('id', ${id}::INTEGER), 21, true,
now() + interval '30 days', now() + interval '40 days')`
}
await performBotBehavior(args, context) await performBotBehavior(args, context)
// ltree is unsupported in Prisma, so we have to query it manually (FUCK!)
return (await tx.$queryRaw`
SELECT *, ltree2text(path) AS path, created_at AS "createdAt", updated_at AS "updatedAt"
FROM "Item" WHERE id = ${parseInt(id)}::INTEGER`
)[0]
}
export async function nonCriticalSideEffects ({ invoice, id }, { models }) {
const item = await models.item.findFirst({
where: invoice ? { invoiceId: invoice.id } : { id: parseInt(id) },
include: {
mentions: true,
itemReferrers: { include: { refereeItem: true } }
}
})
// compare timestamps to only notify if mention or item referral was just created to avoid duplicates on edits // compare timestamps to only notify if mention or item referral was just created to avoid duplicates on edits
for (const { userId, createdAt } of item.mentions) { for (const { userId, createdAt } of item.mentions) {
if (item.updatedAt.getTime() !== createdAt.getTime()) continue if (item.updatedAt.getTime() !== createdAt.getTime()) continue
@ -175,6 +140,12 @@ export async function nonCriticalSideEffects ({ invoice, id }, { models }) {
if (item.updatedAt.getTime() !== createdAt.getTime()) continue if (item.updatedAt.getTime() !== createdAt.getTime()) continue
notifyItemMention({ models, referrerItem: item, refereeItem }).catch(console.error) notifyItemMention({ models, referrerItem: item, refereeItem }).catch(console.error)
} }
// ltree is unsupported in Prisma, so we have to query it manually (FUCK!)
return (await tx.$queryRaw`
SELECT *, ltree2text(path) AS path, created_at AS "createdAt", updated_at AS "updatedAt"
FROM "Item" WHERE id = ${parseInt(id)}::INTEGER`
)[0]
} }
export async function describe ({ id, parentId }, context) { export async function describe ({ id, parentId }, context) {


@ -1,56 +0,0 @@
import { PAID_ACTION_TERMINAL_STATES, USER_ID } from '@/lib/constants'
import { datePivot } from '@/lib/time'
const MAX_PENDING_PAID_ACTIONS_PER_USER = 100
const MAX_PENDING_DIRECT_INVOICES_PER_USER_MINUTES = 10
const MAX_PENDING_DIRECT_INVOICES_PER_USER = 100
export async function assertBelowMaxPendingInvoices (context) {
const { models, me } = context
const pendingInvoices = await models.invoice.count({
where: {
userId: me?.id ?? USER_ID.anon,
actionState: {
notIn: PAID_ACTION_TERMINAL_STATES
}
}
})
if (pendingInvoices >= MAX_PENDING_PAID_ACTIONS_PER_USER) {
throw new Error('You have too many pending paid actions, cancel some or wait for them to expire')
}
}
export async function assertBelowMaxPendingDirectPayments (userId, context) {
const { models, me } = context
if (me?.id !== userId) {
const pendingSenderInvoices = await models.directPayment.count({
where: {
senderId: me?.id ?? USER_ID.anon,
createdAt: {
gt: datePivot(new Date(), { minutes: -MAX_PENDING_DIRECT_INVOICES_PER_USER_MINUTES })
}
}
})
if (pendingSenderInvoices >= MAX_PENDING_DIRECT_INVOICES_PER_USER) {
throw new Error('You\'ve sent too many direct payments')
}
}
if (!userId) return
const pendingReceiverInvoices = await models.directPayment.count({
where: {
receiverId: userId,
createdAt: {
gt: datePivot(new Date(), { minutes: -MAX_PENDING_DIRECT_INVOICES_PER_USER_MINUTES })
}
}
})
if (pendingReceiverInvoices >= MAX_PENDING_DIRECT_INVOICES_PER_USER) {
throw new Error('Receiver has too many direct payments')
}
}


@ -2,11 +2,11 @@ import { USER_ID } from '@/lib/constants'
import { deleteReminders, getDeleteAt, getRemindAt } from '@/lib/item' import { deleteReminders, getDeleteAt, getRemindAt } from '@/lib/item'
import { parseInternalLinks } from '@/lib/url' import { parseInternalLinks } from '@/lib/url'
export async function getMentions ({ text }, { me, tx }) { export async function getMentions ({ text }, { me, models }) {
const mentionPattern = /\B@[\w_]+/gi const mentionPattern = /\B@[\w_]+/gi
const names = text.match(mentionPattern)?.map(m => m.slice(1)) const names = text.match(mentionPattern)?.map(m => m.slice(1))
if (names?.length > 0) { if (names?.length > 0) {
const users = await tx.user.findMany({ const users = await models.user.findMany({
where: { where: {
name: { name: {
in: names in: names
@ -21,7 +21,7 @@ export async function getMentions ({ text }, { me, tx }) {
return [] return []
} }
export const getItemMentions = async ({ text }, { me, tx }) => { export const getItemMentions = async ({ text }, { me, models }) => {
const linkPattern = new RegExp(`${process.env.NEXT_PUBLIC_URL}/items/\\d+[a-zA-Z0-9/?=]*`, 'gi') const linkPattern = new RegExp(`${process.env.NEXT_PUBLIC_URL}/items/\\d+[a-zA-Z0-9/?=]*`, 'gi')
const refs = text.match(linkPattern)?.map(m => { const refs = text.match(linkPattern)?.map(m => {
try { try {
@ -33,7 +33,7 @@ export const getItemMentions = async ({ text }, { me, tx }) => {
}).filter(r => !!r) }).filter(r => !!r)
if (refs?.length > 0) { if (refs?.length > 0) {
const referee = await tx.item.findMany({ const referee = await models.item.findMany({
where: { where: {
id: { in: refs }, id: { in: refs },
userId: { not: me?.id || USER_ID.anon } userId: { not: me?.id || USER_ID.anon }
@ -60,23 +60,23 @@ export async function performBotBehavior ({ text, id }, { me, tx }) {
const deleteAt = getDeleteAt(text) const deleteAt = getDeleteAt(text)
if (deleteAt) { if (deleteAt) {
await tx.$queryRaw` await tx.$queryRaw`
INSERT INTO pgboss.job (name, data, startafter, keepuntil) INSERT INTO pgboss.job (name, data, startafter, expirein)
VALUES ( VALUES (
'deleteItem', 'deleteItem',
jsonb_build_object('id', ${id}::INTEGER), jsonb_build_object('id', ${id}::INTEGER),
${deleteAt}::TIMESTAMP WITH TIME ZONE, ${deleteAt}::TIMESTAMP WITH TIME ZONE,
${deleteAt}::TIMESTAMP WITH TIME ZONE + interval '1 minute')` ${deleteAt}::TIMESTAMP WITH TIME ZONE - now() + interval '1 minute')`
} }
const remindAt = getRemindAt(text) const remindAt = getRemindAt(text)
if (remindAt) { if (remindAt) {
await tx.$queryRaw` await tx.$queryRaw`
INSERT INTO pgboss.job (name, data, startafter, keepuntil) INSERT INTO pgboss.job (name, data, startafter, expirein)
VALUES ( VALUES (
'reminder', 'reminder',
jsonb_build_object('itemId', ${id}::INTEGER, 'userId', ${userId}::INTEGER), jsonb_build_object('itemId', ${id}::INTEGER, 'userId', ${userId}::INTEGER),
${remindAt}::TIMESTAMP WITH TIME ZONE, ${remindAt}::TIMESTAMP WITH TIME ZONE,
${remindAt}::TIMESTAMP WITH TIME ZONE + interval '1 minute')` ${remindAt}::TIMESTAMP WITH TIME ZONE - now() + interval '1 minute')`
await tx.reminder.create({ await tx.reminder.create({
data: { data: {
userId, userId,


@ -1,27 +0,0 @@
import { USER_ID } from '@/lib/constants'
export const GLOBAL_SEEDS = [USER_ID.k00b, USER_ID.ek]
export function initialTrust ({ name, userId }) {
const results = GLOBAL_SEEDS.map(id => ({
subName: name,
userId: id,
zapPostTrust: 1,
subZapPostTrust: 1,
zapCommentTrust: 1,
subZapCommentTrust: 1
}))
if (!GLOBAL_SEEDS.includes(userId)) {
results.push({
subName: name,
userId,
zapPostTrust: 0,
subZapPostTrust: 1,
zapCommentTrust: 0,
subZapCommentTrust: 1
})
}
return results
}


@ -1,13 +1,8 @@
import { PAID_ACTION_PAYMENT_METHODS } from '@/lib/constants'
import { satsToMsats } from '@/lib/format' import { satsToMsats } from '@/lib/format'
export const anonable = false export const anonable = false
export const supportsPessimism = true
export const paymentMethods = [ export const supportsOptimism = true
PAID_ACTION_PAYMENT_METHODS.FEE_CREDIT,
PAID_ACTION_PAYMENT_METHODS.REWARD_SATS,
PAID_ACTION_PAYMENT_METHODS.OPTIMISTIC
]
export async function getCost ({ id }, { me, models }) { export async function getCost ({ id }, { me, models }) {
const pollOption = await models.pollOption.findUnique({ const pollOption = await models.pollOption.findUnique({


@ -1,84 +0,0 @@
import { PAID_ACTION_PAYMENT_METHODS } from '@/lib/constants'
import { toPositiveBigInt, numWithUnits, msatsToSats, satsToMsats } from '@/lib/format'
import { notifyDeposit } from '@/lib/webPush'
import { getInvoiceableWallets } from '@/wallets/server'
export const anonable = false
export const paymentMethods = [
PAID_ACTION_PAYMENT_METHODS.P2P,
PAID_ACTION_PAYMENT_METHODS.DIRECT
]
export async function getCost ({ msats }) {
return toPositiveBigInt(msats)
}
export async function getInvoiceablePeer (_, { me, models, cost, paymentMethod }) {
if (paymentMethod === PAID_ACTION_PAYMENT_METHODS.P2P && !me?.proxyReceive) return null
if (paymentMethod === PAID_ACTION_PAYMENT_METHODS.DIRECT && !me?.directReceive) return null
const wallets = await getInvoiceableWallets(me.id, { models })
if (wallets.length === 0) {
return null
}
if (cost < satsToMsats(me.receiveCreditsBelowSats)) {
return null
}
return me.id
}
export async function getSybilFeePercent () {
return 10n
}
export async function perform ({
invoiceId,
comment,
lud18Data,
noteStr
}, { me, tx }) {
return await tx.invoice.update({
where: { id: invoiceId },
data: {
comment,
lud18Data,
...(noteStr ? { desc: noteStr } : {})
},
include: { invoiceForward: true }
})
}
export async function describe ({ description }, { me, cost, paymentMethod, sybilFeePercent }) {
const fee = paymentMethod === PAID_ACTION_PAYMENT_METHODS.P2P
? cost * BigInt(sybilFeePercent) / 100n
: 0n
return description ?? `SN: ${me?.name ?? ''} receives ${numWithUnits(msatsToSats(cost - fee))}`
}
export async function onPaid ({ invoice }, { tx }) {
if (!invoice) {
throw new Error('invoice is required')
}
// P2P lnurlp does not need to update the user's balance
if (invoice?.invoiceForward) return
await tx.user.update({
where: { id: invoice.userId },
data: {
mcredits: {
increment: invoice.msatsReceived
}
}
})
}
export async function nonCriticalSideEffects ({ invoice }, { models }) {
await notifyDeposit(invoice.userId, invoice)
await models.$executeRaw`
INSERT INTO pgboss.job (name, data)
VALUES ('nip57', jsonb_build_object('hash', ${invoice.hash}))`
}


@ -1,14 +1,10 @@
- import { PAID_ACTION_PAYMENT_METHODS, TERRITORY_PERIOD_COST } from '@/lib/constants'
+ import { TERRITORY_PERIOD_COST } from '@/lib/constants'
import { satsToMsats } from '@/lib/format'
import { nextBilling } from '@/lib/territory'
export const anonable = false
- export const paymentMethods = [
-   PAID_ACTION_PAYMENT_METHODS.FEE_CREDIT,
-   PAID_ACTION_PAYMENT_METHODS.REWARD_SATS,
-   PAID_ACTION_PAYMENT_METHODS.PESSIMISTIC
- ]
+ export const supportsPessimism = true
+ export const supportsOptimism = false
export async function getCost ({ name }, { models }) {
const sub = await models.sub.findUnique({


@ -1,15 +1,9 @@
- import { PAID_ACTION_PAYMENT_METHODS, TERRITORY_PERIOD_COST } from '@/lib/constants'
+ import { TERRITORY_PERIOD_COST } from '@/lib/constants'
import { satsToMsats } from '@/lib/format'
import { nextBilling } from '@/lib/territory'
- import { initialTrust } from './lib/territory'
export const anonable = false
- export const paymentMethods = [
-   PAID_ACTION_PAYMENT_METHODS.FEE_CREDIT,
-   PAID_ACTION_PAYMENT_METHODS.REWARD_SATS,
-   PAID_ACTION_PAYMENT_METHODS.PESSIMISTIC
- ]
+ export const supportsPessimism = true
+ export const supportsOptimism = false
export async function getCost ({ billingType }) {
return satsToMsats(TERRITORY_PERIOD_COST(billingType))
@ -21,7 +15,7 @@ export async function perform ({ invoiceId, ...data }, { me, cost, tx }) {
const billedLastAt = new Date()
const billPaidUntil = nextBilling(billedLastAt, billingType)
- const sub = await tx.sub.create({
+ return await tx.sub.create({
data: {
...data,
billedLastAt,
@ -43,12 +37,6 @@ export async function perform ({ invoiceId, ...data }, { me, cost, tx }) {
}
}
})
- await tx.userSubTrust.createMany({
-   data: initialTrust({ name: sub.name, userId: sub.userId })
- })
- return sub
}
export async function describe ({ name }) {


@ -1,15 +1,10 @@
- import { PAID_ACTION_PAYMENT_METHODS, TERRITORY_PERIOD_COST } from '@/lib/constants'
+ import { TERRITORY_PERIOD_COST } from '@/lib/constants'
import { satsToMsats } from '@/lib/format'
import { nextBilling } from '@/lib/territory'
- import { initialTrust } from './lib/territory'
export const anonable = false
- export const paymentMethods = [
-   PAID_ACTION_PAYMENT_METHODS.FEE_CREDIT,
-   PAID_ACTION_PAYMENT_METHODS.REWARD_SATS,
-   PAID_ACTION_PAYMENT_METHODS.PESSIMISTIC
- ]
+ export const supportsPessimism = true
+ export const supportsOptimism = false
export async function getCost ({ billingType }) {
return satsToMsats(TERRITORY_PERIOD_COST(billingType))
@ -37,7 +32,6 @@ export async function perform ({ name, invoiceId, ...data }, { me, cost, tx }) {
if (sub.userId !== me.id) {
await tx.territoryTransfer.create({ data: { subName: name, oldUserId: sub.userId, newUserId: me.id } })
- await tx.subSubscription.delete({ where: { userId_subName: { userId: sub.userId, subName: name } } })
}
await tx.subAct.create({
@ -49,24 +43,7 @@ export async function perform ({ name, invoiceId, ...data }, { me, cost, tx }) {
}
})
- await tx.subSubscription.upsert({
-   where: {
-     userId_subName: {
-       userId: me.id,
-       subName: name
-     }
-   },
-   update: {
-     userId: me.id,
-     subName: name
-   },
-   create: {
-     userId: me.id,
-     subName: name
-   }
- })
- const updatedSub = await tx.sub.update({
+ return await tx.sub.update({
data,
// optimistic concurrency control
// make sure none of the relevant fields have changed since we fetched the sub
@ -77,12 +54,6 @@ export async function perform ({ name, invoiceId, ...data }, { me, cost, tx }) {
}
}
})
- await tx.userSubTrust.createMany({
-   data: initialTrust({ name: updatedSub.name, userId: updatedSub.userId })
- })
- return updatedSub
}
export async function describe ({ name }, context) {


@ -1,15 +1,11 @@
- import { PAID_ACTION_PAYMENT_METHODS, TERRITORY_PERIOD_COST } from '@/lib/constants'
+ import { TERRITORY_PERIOD_COST } from '@/lib/constants'
import { satsToMsats } from '@/lib/format'
import { proratedBillingCost } from '@/lib/territory'
import { datePivot } from '@/lib/time'
export const anonable = false
- export const paymentMethods = [
-   PAID_ACTION_PAYMENT_METHODS.FEE_CREDIT,
-   PAID_ACTION_PAYMENT_METHODS.REWARD_SATS,
-   PAID_ACTION_PAYMENT_METHODS.PESSIMISTIC
- ]
+ export const supportsPessimism = true
+ export const supportsOptimism = false
export async function getCost ({ oldName, billingType }, { models }) {
const oldSub = await models.sub.findUnique({


@ -1,63 +1,17 @@
- import { PAID_ACTION_PAYMENT_METHODS, USER_ID } from '@/lib/constants'
+ import { USER_ID } from '@/lib/constants'
import { msatsToSats, satsToMsats } from '@/lib/format'
import { notifyZapped } from '@/lib/webPush'
- import { getInvoiceableWallets } from '@/wallets/server'
- import { Prisma } from '@prisma/client'
export const anonable = true
- export const paymentMethods = [
-   PAID_ACTION_PAYMENT_METHODS.P2P,
-   PAID_ACTION_PAYMENT_METHODS.FEE_CREDIT,
-   PAID_ACTION_PAYMENT_METHODS.REWARD_SATS,
-   PAID_ACTION_PAYMENT_METHODS.OPTIMISTIC,
-   PAID_ACTION_PAYMENT_METHODS.PESSIMISTIC
- ]
+ export const supportsPessimism = true
+ export const supportsOptimism = true
export async function getCost ({ sats }) {
return satsToMsats(sats)
}
- export async function getInvoiceablePeer ({ id, sats, hasSendWallet }, { models, me, cost }) {
-   // if the zap is dust, or if me doesn't have a send wallet but has enough sats/credits to pay for it
-   // then we don't invoice the peer
-   if (sats < me?.sendCreditsBelowSats ||
-     (me && !hasSendWallet && (me.mcredits >= cost || me.msats >= cost))) {
-     return null
-   }
-   const item = await models.item.findUnique({
-     where: { id: parseInt(id) },
-     include: {
-       itemForwards: true,
-       user: true
-     }
-   })
-   // bios don't get sats
-   if (item.bio) {
-     return null
-   }
-   const wallets = await getInvoiceableWallets(item.userId, { models })
-   // request peer invoice if they have an attached wallet and have not forwarded the item
-   // and the receiver doesn't want to receive credits
-   if (wallets.length > 0 &&
-     item.itemForwards.length === 0 &&
-     sats >= item.user.receiveCreditsBelowSats) {
-     return item.userId
-   }
-   return null
- }
- export async function getSybilFeePercent () {
-   return 30n
- }
- export async function perform ({ invoiceId, sats, id: itemId, ...args }, { me, cost, sybilFeePercent, tx }) {
-   const feeMsats = cost * sybilFeePercent / 100n
+ export async function perform ({ invoiceId, sats, id: itemId, ...args }, { me, cost, tx }) {
+   const feeMsats = cost / BigInt(10) // 10% fee
const zapMsats = cost - feeMsats
itemId = parseInt(itemId)
@ -93,7 +47,7 @@ export async function retry ({ invoiceId, newInvoiceId }, { tx, cost }) {
return { id, sats: msatsToSats(cost), act: 'TIP', path }
}
- export async function onPaid ({ invoice, actIds }, { tx }) {
+ export async function onPaid ({ invoice, actIds }, { models, tx }) {
let acts
if (invoice) {
await tx.itemAct.updateMany({
@ -114,58 +68,34 @@ export async function onPaid ({ invoice, actIds }, { tx }) {
const sats = msatsToSats(msats)
const itemAct = acts.find(act => act.act === 'TIP')
- if (invoice?.invoiceForward) {
-   // only the op got sats and we need to add it to their stackedMsats
-   // because the sats were p2p
-   await tx.user.update({
-     where: { id: itemAct.item.userId },
-     data: { stackedMsats: { increment: itemAct.msats } }
-   })
- } else {
-   // splits only use mcredits
-   await tx.$executeRaw`
-     WITH forwardees AS (
-       SELECT "userId", ((${itemAct.msats}::BIGINT * pct) / 100)::BIGINT AS mcredits
-       FROM "ItemForward"
-       WHERE "itemId" = ${itemAct.itemId}::INTEGER
-     ), total_forwarded AS (
-       SELECT COALESCE(SUM(mcredits), 0) as mcredits
-       FROM forwardees
-     ), recipients AS (
-       SELECT "userId", mcredits FROM forwardees
-       UNION
-       SELECT ${itemAct.item.userId}::INTEGER as "userId",
-         ${itemAct.msats}::BIGINT - (SELECT mcredits FROM total_forwarded)::BIGINT as mcredits
-       ORDER BY "userId" ASC -- order to prevent deadlocks
-     )
-     UPDATE users
-     SET
-       mcredits = users.mcredits + recipients.mcredits,
-       "stackedMsats" = users."stackedMsats" + recipients.mcredits,
-       "stackedMcredits" = users."stackedMcredits" + recipients.mcredits
-     FROM recipients
-     WHERE users.id = recipients."userId"`
- }
+ // give user and all forwards the sats
+ await tx.$executeRaw`
+   WITH forwardees AS (
+     SELECT "userId", ((${itemAct.msats}::BIGINT * pct) / 100)::BIGINT AS msats
+     FROM "ItemForward"
+     WHERE "itemId" = ${itemAct.itemId}::INTEGER
+   ), total_forwarded AS (
+     SELECT COALESCE(SUM(msats), 0) as msats
+     FROM forwardees
+   ), forward AS (
+     UPDATE users
+     SET
+       msats = users.msats + forwardees.msats,
+       "stackedMsats" = users."stackedMsats" + forwardees.msats
+     FROM forwardees
+     WHERE users.id = forwardees."userId"
+   )
+   UPDATE users
+   SET
+     msats = msats + ${itemAct.msats}::BIGINT - (SELECT msats FROM total_forwarded)::BIGINT,
+     "stackedMsats" = "stackedMsats" + ${itemAct.msats}::BIGINT - (SELECT msats FROM total_forwarded)::BIGINT
+   WHERE id = ${itemAct.item.userId}::INTEGER`
// perform denomormalized aggregates: weighted votes, upvotes, msats, lastZapAt
// NOTE: for the rows that might be updated by a concurrent zap, we use UPDATE for implicit locking
- await tx.$queryRaw`
-   WITH territory AS (
-     SELECT COALESCE(r."subName", i."subName", 'meta')::TEXT as "subName"
-     FROM "Item" i
-     LEFT JOIN "Item" r ON r.id = i."rootId"
-     WHERE i.id = ${itemAct.itemId}::INTEGER
-   ), zapper AS (
-     SELECT
-       COALESCE(${itemAct.item.parentId
-         ? Prisma.sql`"zapCommentTrust"`
-         : Prisma.sql`"zapPostTrust"`}, 0) as "zapTrust",
-       COALESCE(${itemAct.item.parentId
-         ? Prisma.sql`"subZapCommentTrust"`
-         : Prisma.sql`"subZapPostTrust"`}, 0) as "subZapTrust"
-     FROM territory
-     LEFT JOIN "UserSubTrust" ust ON ust."subName" = territory."subName"
-       AND ust."userId" = ${itemAct.userId}::INTEGER
+ const [item] = await tx.$queryRaw`
+   WITH zapper AS (
+     SELECT trust FROM users WHERE id = ${itemAct.userId}::INTEGER
), zap AS (
INSERT INTO "ItemUserAgg" ("userId", "itemId", "zapSats")
VALUES (${itemAct.userId}::INTEGER, ${itemAct.itemId}::INTEGER, ${sats}::INTEGER)
@ -173,30 +103,16 @@ export async function onPaid ({ invoice, actIds }, { tx }) {
SET "zapSats" = "ItemUserAgg"."zapSats" + ${sats}::INTEGER, updated_at = now()
RETURNING ("zapSats" = ${sats}::INTEGER)::INTEGER as first_vote,
LOG("zapSats" / GREATEST("zapSats" - ${sats}::INTEGER, 1)::FLOAT) AS log_sats
- ), item_zapped AS (
-   UPDATE "Item"
-   SET
-     "weightedVotes" = "weightedVotes" + zapper."zapTrust" * zap.log_sats,
-     "subWeightedVotes" = "subWeightedVotes" + zapper."subZapTrust" * zap.log_sats,
-     upvotes = upvotes + zap.first_vote,
-     msats = "Item".msats + ${msats}::BIGINT,
-     mcredits = "Item".mcredits + ${invoice?.invoiceForward ? 0n : msats}::BIGINT,
-     "lastZapAt" = now()
-   FROM zap, zapper
-   WHERE "Item".id = ${itemAct.itemId}::INTEGER
-   RETURNING "Item".*, zapper."zapTrust" * zap.log_sats as "weightedVote"
- ), ancestors AS (
-   SELECT "Item".*
-   FROM "Item", item_zapped
-   WHERE "Item".path @> item_zapped.path AND "Item".id <> item_zapped.id
-   ORDER BY "Item".id
)
UPDATE "Item"
- SET "weightedComments" = "Item"."weightedComments" + item_zapped."weightedVote",
-   "commentMsats" = "Item"."commentMsats" + ${msats}::BIGINT,
-   "commentMcredits" = "Item"."commentMcredits" + ${invoice?.invoiceForward ? 0n : msats}::BIGINT
- FROM item_zapped, ancestors
- WHERE "Item".id = ancestors.id`
+ SET
+   "weightedVotes" = "weightedVotes" + (zapper.trust * zap.log_sats),
+   upvotes = upvotes + zap.first_vote,
+   msats = "Item".msats + ${msats}::BIGINT,
+   "lastZapAt" = now()
+ FROM zap, zapper
+ WHERE "Item".id = ${itemAct.itemId}::INTEGER
+ RETURNING "Item".*`
// record potential bounty payment
// NOTE: we are at least guaranteed that we see the update "ItemUserAgg" from our tx so we can trust
@ -216,24 +132,18 @@ export async function onPaid ({ invoice, actIds }, { tx }) {
SET "bountyPaidTo" = array_remove(array_append(array_remove("bountyPaidTo", bounty.target), bounty.target), NULL)
FROM bounty
WHERE "Item".id = bounty.id AND bounty.paid`
- }
- export async function nonCriticalSideEffects ({ invoice, actIds }, { models }) {
-   const itemAct = await models.itemAct.findFirst({
-     where: invoice ? { invoiceId: invoice.id } : { id: { in: actIds } },
-     include: { item: true }
-   })
-   // avoid duplicate notifications with the same zap amount
-   // by checking if there are any other pending acts on the item
-   const pendingActs = await models.itemAct.count({
-     where: {
-       itemId: itemAct.itemId,
-       createdAt: {
-         gt: itemAct.createdAt
-       }
-     }
-   })
-   if (pendingActs === 0) notifyZapped({ models, item: itemAct.item }).catch(console.error)
+ // update commentMsats on ancestors
+ await tx.$executeRaw`
+   WITH zapped AS (
+     SELECT * FROM "Item" WHERE id = ${itemAct.itemId}::INTEGER
+   )
+   UPDATE "Item"
+   SET "commentMsats" = "Item"."commentMsats" + ${msats}::BIGINT
+   FROM zapped
+   WHERE "Item".path @> zapped.path AND "Item".id <> zapped.id`
+ notifyZapped({ models, item }).catch(console.error)
}
export async function onFail ({ invoice }, { tx }) {
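Both versions of the ranking SQL above weight a zap by the same quantity: the zapper's trust times `LOG("zapSats" / GREATEST("zapSats" - sats, 1))`. A small JavaScript sketch of that arithmetic (function and argument names are illustrative; Postgres `LOG()` is base 10, hence `Math.log10`):

```js
// How much a single zap moves an item's weightedVotes, mirroring the SQL above.
function weightedVoteDelta (trust, zapSatsAfter, satsJustZapped) {
  // first zap: log10(total / 1); repeat zaps: log10(new total / previous total)
  const logSats = Math.log10(zapSatsAfter / Math.max(zapSatsAfter - satsJustZapped, 1))
  return trust * logSats
}

// e.g. a zapper with trust 0.9 adding 90 sats on top of an existing 10:
// weightedVoteDelta(0.9, 100, 90) === 0.9 * log10(100 / 10) === 0.9
```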


@ -1,64 +0,0 @@
import { LND_PATHFINDING_TIME_PREF_PPM, LND_PATHFINDING_TIMEOUT_MS } from '@/lib/constants'
import { msatsToSats, satsToMsats, toPositiveBigInt } from '@/lib/format'
import { Prisma } from '@prisma/client'
import { parsePaymentRequest, payViaPaymentRequest } from 'ln-service'
// paying actions are completely distinct from paid actions
// and there's only one paying action: send
// ... still we want the api to at least be similar
export default async function performPayingAction ({ bolt11, maxFee, walletId }, { me, models, lnd }) {
try {
console.group('performPayingAction', `${bolt11.slice(0, 10)}...`, maxFee, walletId)
if (!me) {
throw new Error('You must be logged in to perform this action')
}
const decoded = await parsePaymentRequest({ request: bolt11 })
const cost = toPositiveBigInt(toPositiveBigInt(decoded.mtokens) + satsToMsats(maxFee))
console.log('cost', cost)
const withdrawal = await models.$transaction(async tx => {
await tx.user.update({
where: {
id: me.id
},
data: { msats: { decrement: cost } }
})
return await tx.withdrawl.create({
data: {
hash: decoded.id,
bolt11,
msatsPaying: toPositiveBigInt(decoded.mtokens),
msatsFeePaying: satsToMsats(maxFee),
userId: me.id,
walletId,
autoWithdraw: !!walletId
}
})
}, { isolationLevel: Prisma.TransactionIsolationLevel.ReadCommitted })
payViaPaymentRequest({
lnd,
request: withdrawal.bolt11,
max_fee: msatsToSats(withdrawal.msatsFeePaying),
pathfinding_timeout: LND_PATHFINDING_TIMEOUT_MS,
confidence: LND_PATHFINDING_TIME_PREF_PPM
}).catch(console.error)
return withdrawal
} catch (e) {
if (e.message.includes('\\"users\\" violates check constraint \\"msats_positive\\"')) {
throw new Error('insufficient funds')
}
if (e instanceof Prisma.PrismaClientKnownRequestError && e.code === 'P2002') {
throw new Error('you cannot withdraw to the same invoice twice')
}
console.error('performPayingAction failed', e)
throw e
} finally {
console.groupEnd()
}
}


@ -1,5 +1,3 @@
- import { SN_ADMIN_IDS } from '@/lib/constants'
export default {
Query: {
snl: async (parent, _, { models }) => {
@ -9,7 +7,7 @@ export default {
},
Mutation: {
onAirToggle: async (parent, _, { models, me }) => {
- if (!me || !SN_ADMIN_IDS.includes(me.id)) {
+ if (me.id !== 616) {
throw new Error('not an admin')
}
const { id, live } = await models.snl.findFirst()


@ -1,7 +1,7 @@
- import { GqlAuthorizationError } from '@/lib/error'
+ import { GraphQLError } from 'graphql'
export default function assertApiKeyNotPermitted ({ me }) {
if (me?.apiKey === true) {
- throw new GqlAuthorizationError('this operation is not allowed to be performed via API Key')
+ throw new GraphQLError('this operation is not allowed to be performed via API Key', { extensions: { code: 'FORBIDDEN' } })
}
}


@ -1,27 +1,37 @@
- import { isServiceEnabled } from '@/lib/sndev'
- import { cachedFetcher } from '@/lib/fetch'
- import { getHeight } from 'ln-service'
+ import lndService from 'ln-service'
+ import lnd from '@/api/lnd'
- const getBlockHeight = cachedFetcher(async function fetchBlockHeight ({ lnd }) {
+ const cache = new Map()
+ const expiresIn = 1000 * 30 // 30 seconds in milliseconds
+ async function fetchBlockHeight () {
+   let blockHeight = 0
try {
- const { current_block_height: height } = await getHeight({ lnd })
- return height
+ const height = await lndService.getHeight({ lnd })
+ blockHeight = height.current_block_height
} catch (err) {
- console.error('getBlockHeight', err)
- return 0
+ console.error('fetchBlockHeight', err)
}
- }, {
-   maxSize: 1,
-   cacheExpiry: 60 * 1000, // 1 minute
-   forceRefreshThreshold: 0,
-   keyGenerator: () => 'getBlockHeight'
- })
+ cache.set('block', { height: blockHeight, createdAt: Date.now() })
+ return blockHeight
+ }
+ async function getBlockHeight () {
+   if (cache.has('block')) {
+     const { height, createdAt } = cache.get('block')
+     const expired = createdAt + expiresIn < Date.now()
+     if (expired) fetchBlockHeight().catch(console.error) // update cache
+     return height // serve stale block height (this on the SSR critical path)
+   } else {
+     fetchBlockHeight().catch(console.error)
+   }
+   return 0
+ }
export default {
Query: {
- blockHeight: async (parent, opts, { lnd }) => {
-   if (!isServiceEnabled('payments')) return 0
-   return await getBlockHeight({ lnd }) || 0
+ blockHeight: async (parent, opts, ctx) => {
+   return await getBlockHeight()
}
}
}
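Both sides of the blockHeight resolver above implement the same idea — return the last cached value immediately and refresh it in the background — once through `cachedFetcher` from `lib/fetch` and once with a hand-rolled `Map`. A generic stale-while-revalidate sketch of that pattern (all names are illustrative, not the actual `cachedFetcher` API):

```js
// Wrap an async fetcher so callers get the cached value instantly while a
// background refresh runs whenever the entry is older than `ttlMs`.
function staleWhileRevalidate (fetchFn, { ttlMs = 30 * 1000, fallback = 0 } = {}) {
  const cache = new Map()
  return async function get (key = 'default') {
    const refresh = () => fetchFn(key)
      .then(value => cache.set(key, { value, createdAt: Date.now() }))
      .catch(console.error)
    const entry = cache.get(key)
    if (!entry) {
      refresh()
      return fallback
    }
    if (Date.now() - entry.createdAt > ttlMs) refresh()
    return entry.value
  }
}

// e.g. const getBlockHeight = staleWhileRevalidate(() => fetchHeightFromLnd(), { ttlMs: 30_000 })
```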


@ -1,26 +1,36 @@
- import { cachedFetcher } from '@/lib/fetch'
- const getChainFeeRate = cachedFetcher(async function fetchChainFeeRate () {
-   const url = 'https://mempool.space/api/v1/fees/recommended'
-   try {
-     const res = await fetch(url)
-     const body = await res.json()
-     return body.hourFee
-   } catch (err) {
-     console.error('fetchChainFee', err)
-     return 0
-   }
- }, {
-   maxSize: 1,
-   cacheExpiry: 60 * 1000, // 1 minute
-   forceRefreshThreshold: 0, // never force refresh
-   keyGenerator: () => 'getChainFeeRate'
- })
+ const cache = new Map()
+ const expiresIn = 1000 * 30 // 30 seconds in milliseconds
+ async function fetchChainFeeRate () {
+   const url = 'https://mempool.space/api/v1/fees/recommended'
+   const chainFee = await fetch(url)
+     .then((res) => res.json())
+     .then((body) => body.hourFee)
+     .catch((err) => {
+       console.error('fetchChainFee', err)
+       return 0
+     })
+   cache.set('fee', { fee: chainFee, createdAt: Date.now() })
+   return chainFee
+ }
+ async function getChainFeeRate () {
+   if (cache.has('fee')) {
+     const { fee, createdAt } = cache.get('fee')
+     const expired = createdAt + expiresIn < Date.now()
+     if (expired) fetchChainFeeRate().catch(console.error) // update cache
+     return fee
+   } else {
+     fetchChainFeeRate().catch(console.error)
+   }
+   return 0
+ }
export default {
Query: {
chainFee: async (parent, opts, ctx) => {
- return await getChainFeeRate() || 0
+ return await getChainFeeRate()
}
}
}

View File

@ -121,39 +121,6 @@ export default {
FROM ${viewGroup(range, 'stacking_growth')}
GROUP BY time
ORDER BY time ASC`, ...range)
- },
- itemGrowthSubs: async (parent, { when, to, from, sub }, { models }) => {
-   const range = whenRange(when, from, to)
-   const subExists = await models.sub.findUnique({ where: { name: sub } })
-   if (!subExists) throw new Error('Sub not found')
-   return await models.$queryRawUnsafe(`
-     SELECT date_trunc('${timeUnitForRange(range)}', t) at time zone 'America/Chicago' as time, json_build_array(
-       json_build_object('name', 'posts', 'value', coalesce(sum(posts),0)),
-       json_build_object('name', 'comments', 'value', coalesce(sum(comments),0))
-     ) AS data
-     FROM ${viewGroup(range, 'sub_stats')}
-     WHERE sub_name = $3
-     GROUP BY time
-     ORDER BY time ASC`, ...range, sub)
- },
- revenueGrowthSubs: async (parent, { when, to, from, sub }, { models }) => {
-   const range = whenRange(when, from, to)
-   const subExists = await models.sub.findUnique({ where: { name: sub } })
-   if (!subExists) throw new Error('Sub not found')
-   return await models.$queryRawUnsafe(`
-     SELECT date_trunc('${timeUnitForRange(range)}', t) at time zone 'America/Chicago' as time, json_build_array(
-       json_build_object('name', 'revenue', 'value', coalesce(sum(msats_revenue/1000),0)),
-       json_build_object('name', 'stacking', 'value', coalesce(sum(msats_stacked/1000),0)),
-       json_build_object('name', 'spending', 'value', coalesce(sum(msats_spent/1000),0))
-     ) AS data
-     FROM ${viewGroup(range, 'sub_stats')}
-     WHERE sub_name = $3
-     GROUP BY time
-     ORDER BY time ASC`, ...range, sub)
- }
}
}
}

api/resolvers/image.js (new file, 25 lines)

@ -0,0 +1,25 @@
import { USER_ID, AWS_S3_URL_REGEXP } from '@/lib/constants'
import { msatsToSats } from '@/lib/format'
export default {
Query: {
imageFeesInfo: async (parent, { s3Keys }, { models, me }) => {
return imageFeesInfo(s3Keys, { models, me })
}
}
}
export function uploadIdsFromText (text, { models }) {
if (!text) return []
return [...new Set([...text.matchAll(AWS_S3_URL_REGEXP)].map(m => Number(m[1])))]
}
export async function imageFeesInfo (s3Keys, { models, me }) {
// returns info object in this format:
// { bytes24h: int, bytesUnpaid: int, nUnpaid: int, imageFeeMsats: BigInt }
const [info] = await models.$queryRawUnsafe('SELECT * FROM image_fees_info($1::INTEGER, $2::INTEGER[])', me ? me.id : USER_ID.anon, s3Keys)
const imageFee = msatsToSats(info.imageFeeMsats)
const totalFeesMsats = info.nUnpaid * Number(info.imageFeeMsats)
const totalFees = msatsToSats(totalFeesMsats)
return { ...info, imageFee, totalFees, totalFeesMsats }
}


@ -16,10 +16,10 @@ import { GraphQLJSONObject as JSONObject } from 'graphql-type-json'
import admin from './admin'
import blockHeight from './blockHeight'
import chainFee from './chainFee'
+ import image from './image'
import { GraphQLScalarType, Kind } from 'graphql'
import { createIntScalar } from 'graphql-scalar'
import paidAction from './paidAction'
- import vault from './vault'
const date = new GraphQLScalarType({
name: 'Date',
@ -56,4 +56,4 @@ const limit = createIntScalar({
export default [user, item, message, wallet, lnurl, notifications, invite, sub,
upload, search, growth, rewards, referrals, price, admin, blockHeight, chainFee,
- { JSONObject }, { Date: date }, { Limit: limit }, paidAction, vault]
+ image, { JSONObject }, { Date: date }, { Limit: limit }, paidAction]


@ -1,15 +1,15 @@
- import { inviteSchema, validateSchema } from '@/lib/validate'
+ import { GraphQLError } from 'graphql'
+ import { inviteSchema, ssValidate } from '@/lib/validate'
import { msatsToSats } from '@/lib/format'
import assertApiKeyNotPermitted from './apiKey'
- import { GqlAuthenticationError, GqlInputError } from '@/lib/error'
- import { Prisma } from '@prisma/client'
export default {
Query: {
invites: async (parent, args, { me, models }) => {
if (!me) {
- throw new GqlAuthenticationError()
+ throw new GraphQLError('you must be logged in', { extensions: { code: 'FORBIDDEN' } })
}
return await models.invite.findMany({
where: {
userId: me.id
@ -29,48 +29,27 @@ export default {
},
Mutation: {
- createInvite: async (parent, { id, gift, limit, description }, { me, models }) => {
+ createInvite: async (parent, { gift, limit }, { me, models }) => {
if (!me) {
- throw new GqlAuthenticationError()
+ throw new GraphQLError('you must be logged in', { extensions: { code: 'FORBIDDEN' } })
}
assertApiKeyNotPermitted({ me })
- await validateSchema(inviteSchema, { id, gift, limit, description })
+ await ssValidate(inviteSchema, { gift, limit })
- try {
-   return await models.invite.create({
-     data: {
-       id,
-       gift,
-       limit,
-       userId: me.id,
-       description
-     }
-   })
- } catch (error) {
-   if (error instanceof Prisma.PrismaClientKnownRequestError) {
-     if (error.code === 'P2002' && error.meta.target.includes('id')) {
-       throw new GqlInputError('an invite with this code already exists')
-     }
-   }
-   throw error
- }
+ return await models.invite.create({
+   data: { gift, limit, userId: me.id }
+ })
},
revokeInvite: async (parent, { id }, { me, models }) => {
if (!me) {
- throw new GqlAuthenticationError()
+ throw new GraphQLError('you must be logged in', { extensions: { code: 'FORBIDDEN' } })
}
- try {
-   return await models.invite.update({
-     where: { id, userId: me.id },
-     data: { revoked: true }
-   })
- } catch (err) {
-   if (err.code === 'P2025') {
-     throw new GqlInputError('invite not found')
-   }
-   throw err
- }
+ return await models.invite.update({
+   where: { id },
+   data: { revoked: true }
+ })
}
},
@ -83,10 +62,7 @@ export default {
},
poor: async (invite, args, { me, models }) => {
const user = await models.user.findUnique({ where: { id: invite.userId } })
- return msatsToSats(user.msats) < invite.gift && msatsToSats(user.mcredits) < invite.gift
+ return msatsToSats(user.msats) < invite.gift
- },
- description: (invite, args, { me }) => {
-   return invite.userId === me?.id ? invite.description : undefined
}
}
}

File diff suppressed because it is too large


@ -1,8 +1,8 @@
import { randomBytes } from 'crypto'
import { bech32 } from 'bech32'
+ import { GraphQLError } from 'graphql'
import assertGofacYourself from './ofac'
import assertApiKeyNotPermitted from './apiKey'
- import { GqlAuthenticationError } from '@/lib/error'
function encodedUrl (iurl, tag, k1) {
const url = new URL(iurl)
@ -35,7 +35,7 @@ export default {
await assertGofacYourself({ models, headers })
if (!me) {
- throw new GqlAuthenticationError()
+ throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
}
assertApiKeyNotPermitted({ me })


@ -1,4 +1,4 @@
- import { GqlInputError } from '@/lib/error'
+ import { GraphQLError } from 'graphql'
export default {
Query: {
@ -11,7 +11,7 @@ export default {
Mutation: {
createMessage: async (parent, { text }, { me, models }) => {
if (!text) {
- throw new GqlInputError('must have text')
+ throw new GraphQLError('Must have text', { extensions: { code: 'BAD_INPUT' } })
}
return await models.message.create({


@ -1,18 +1,17 @@
+ import { GraphQLError } from 'graphql'
import { decodeCursor, LIMIT, nextNoteCursorEncoded } from '@/lib/cursor'
import { getItem, filterClause, whereClause, muteClause, activeOrMine } from './item'
import { getInvoice, getWithdrawl } from './wallet'
- import { pushSubscriptionSchema, validateSchema } from '@/lib/validate'
+ import { pushSubscriptionSchema, ssValidate } from '@/lib/validate'
import { replyToSubscription } from '@/lib/webPush'
import { getSub } from './sub'
- import { GqlAuthenticationError, GqlInputError } from '@/lib/error'
- import { WALLET_MAX_RETRIES, WALLET_RETRY_BEFORE_MS } from '@/lib/constants'
export default {
Query: {
notifications: async (parent, { cursor, inc }, { me, models }) => {
const decodedCursor = decodeCursor(cursor)
if (!me) {
- throw new GqlAuthenticationError()
+ throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
}
const meFull = await models.user.findUnique({ where: { id: me.id } })
@ -180,6 +179,17 @@ export default {
)`
)
+ queries.push(
+   `(SELECT "Item".id::text, "Item"."statusUpdatedAt" AS "sortTime", NULL as "earnedSats",
+     'JobChanged' AS type
+     FROM "Item"
+     WHERE "Item"."userId" = $1
+     AND "maxBid" IS NOT NULL
+     AND "statusUpdatedAt" < $2 AND "statusUpdatedAt" <> created_at
+     ORDER BY "sortTime" DESC
+     LIMIT ${LIMIT})`
+ )
// territory transfers
queries.push(
`(SELECT "TerritoryTransfer".id::text, "TerritoryTransfer"."created_at" AS "sortTime", NULL as "earnedSats",
@ -218,20 +228,14 @@ export default {
if (meFull.noteDeposits) {
queries.push(
- `(SELECT "Invoice".id::text, "Invoice"."confirmedAt" AS "sortTime",
-   FLOOR("Invoice"."msatsReceived" / 1000) as "earnedSats",
+ `(SELECT "Invoice".id::text, "Invoice"."confirmedAt" AS "sortTime", FLOOR("msatsReceived" / 1000) as "earnedSats",
'InvoicePaid' AS type
FROM "Invoice"
WHERE "Invoice"."userId" = $1
- AND "Invoice"."confirmedAt" IS NOT NULL
- AND "Invoice"."created_at" < $2
- AND (
-   ("Invoice"."isHeld" IS NULL AND "Invoice"."actionType" IS NULL)
-   OR (
-     "Invoice"."actionType" = 'RECEIVE'
-     AND "Invoice"."actionState" = 'PAID'
-   )
- )
+ AND "confirmedAt" IS NOT NULL
+ AND "isHeld" IS NULL
+ AND "actionState" IS NULL
+ AND created_at < $2
ORDER BY "sortTime" DESC
LIMIT ${LIMIT})`
)
@ -239,17 +243,12 @@ export default {
if (meFull.noteWithdrawals) {
queries.push(
- `(SELECT "Withdrawl".id::text, MAX(COALESCE("Invoice"."confirmedAt", "Withdrawl".created_at)) AS "sortTime",
-   FLOOR(MAX("Withdrawl"."msatsPaid" / 1000)) as "earnedSats",
+ `(SELECT "Withdrawl".id::text, "Withdrawl".created_at AS "sortTime", FLOOR("msatsPaid" / 1000) as "earnedSats",
'WithdrawlPaid' AS type
FROM "Withdrawl"
- LEFT JOIN "InvoiceForward" ON "InvoiceForward"."withdrawlId" = "Withdrawl".id
- LEFT JOIN "Invoice" ON "InvoiceForward"."invoiceId" = "Invoice".id
WHERE "Withdrawl"."userId" = $1
- AND "Withdrawl".status = 'CONFIRMED'
- AND "Withdrawl".created_at < $2
- AND "InvoiceForward"."id" IS NULL
- GROUP BY "Withdrawl".id
+ AND status = 'CONFIRMED'
+ AND created_at < $2
ORDER BY "sortTime" DESC
LIMIT ${LIMIT})`
)
@ -346,31 +345,16 @@ export default {
)
queries.push(
- `(SELECT "Invoice".id::text,
-   CASE
-     WHEN
-       "Invoice"."paymentAttempt" < ${WALLET_MAX_RETRIES}
-       AND "Invoice"."userCancel" = false
-       AND "Invoice"."cancelledAt" <= now() - interval '${`${WALLET_RETRY_BEFORE_MS} milliseconds`}'
-     THEN "Invoice"."cancelledAt" + interval '${`${WALLET_RETRY_BEFORE_MS} milliseconds`}'
-     ELSE "Invoice"."updated_at"
-   END AS "sortTime", NULL as "earnedSats", 'Invoicification' AS type
+ `(SELECT "Invoice".id::text, "Invoice"."updated_at" AS "sortTime", NULL as "earnedSats", 'Invoicification' AS type
FROM "Invoice"
WHERE "Invoice"."userId" = $1
AND "Invoice"."updated_at" < $2
AND "Invoice"."actionState" = 'FAILED'
- AND (
-   -- this is the inverse of the filter for automated retries
-   "Invoice"."paymentAttempt" >= ${WALLET_MAX_RETRIES}
-   OR "Invoice"."userCancel" = true
-   OR "Invoice"."cancelledAt" <= now() - interval '${`${WALLET_RETRY_BEFORE_MS} milliseconds`}'
- )
AND (
"Invoice"."actionType" = 'ITEM_CREATE' OR
"Invoice"."actionType" = 'ZAP' OR
"Invoice"."actionType" = 'DOWN_ZAP' OR
- "Invoice"."actionType" = 'POLL_VOTE' OR
- "Invoice"."actionType" = 'BOOST'
+ "Invoice"."actionType" = 'POLL_VOTE'
)
ORDER BY "sortTime" DESC
LIMIT ${LIMIT})`
@ -398,10 +382,10 @@ export default {
Mutation: {
savePushSubscription: async (parent, { endpoint, p256dh, auth, oldEndpoint }, { me, models }) => {
if (!me) {
- throw new GqlAuthenticationError()
+ throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
}
- await validateSchema(pushSubscriptionSchema, { endpoint, p256dh, auth })
+ await ssValidate(pushSubscriptionSchema, { endpoint, p256dh, auth })
let dbPushSubscription
if (oldEndpoint) {
@ -422,12 +406,12 @@ export default {
},
deletePushSubscription: async (parent, { endpoint }, { me, models }) => {
if (!me) {
- throw new GqlAuthenticationError()
+ throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
}
const subscription = await models.pushSubscription.findFirst({ where: { endpoint, userId: Number(me.id) } })
if (!subscription) {
- throw new GqlInputError('endpoint not found')
+ throw new GraphQLError('endpoint not found', { extensions: { code: 'BAD_INPUT' } })
}
const deletedSubscription = await models.pushSubscription.delete({ where: { id: subscription.id } })
console.log(`[webPush] deleted subscription ${deletedSubscription.id} of user ${deletedSubscription.userId} due to client request`)
@ -482,24 +466,6 @@ export default {
return subAct.subName
}
},
- ReferralSource: {
-   __resolveType: async (n, args, { models }) => n.type
- },
- Referral: {
-   source: async (n, args, { models, me }) => {
-     // retrieve the referee landing record
-     const referral = await models.oneDayReferral.findFirst({ where: { refereeId: Number(n.id), landing: true } })
-     if (!referral) return null // if no landing record, it will return a generic referral
-     switch (referral.type) {
-       case 'POST':
-       case 'COMMENT': return { ...await getItem(n, { id: referral.typeId }, { models, me }), type: 'Item' }
-       case 'TERRITORY': return { ...await getSub(n, { name: referral.typeId }, { models, me }), type: 'Sub' }
-       case 'PROFILE': return { ...await models.user.findUnique({ where: { id: Number(referral.typeId) }, select: { name: true } }), type: 'User' }
-       default: return null
-     }
-   }
- },
Streak: {
days: async (n, args, { models }) => {
const res = await models.$queryRaw`
@ -509,14 +475,6 @@ export default {
`
return res.length ? res[0].days : null
- },
- type: async (n, args, { models }) => {
-   const res = await models.$queryRaw`
-     SELECT "type"
-     FROM "Streak"
-     WHERE id = ${Number(n.id)}
-   `
-   return res.length ? res[0].type : null
}
},
Earn: {


@ -1,11 +1,13 @@
- import { GqlAuthorizationError } from '@/lib/error'
+ import { GraphQLError } from 'graphql'
// this function makes america more secure apparently
export default async function assertGofacYourself ({ models, headers, ip }) {
const country = await gOFACYourself({ models, headers, ip })
if (!country) return
- throw new GqlAuthorizationError(`Your IP address is in ${country}. We cannot provide financial services to residents of ${country}.`)
+ throw new GraphQLError(
+   `Your IP address is in ${country}. We cannot provide financial services to residents of ${country}.`,
+   { extensions: { code: 'FORBIDDEN' } })
}
export async function gOFACYourself ({ models, headers = {}, ip }) {


@ -1,5 +1,5 @@
import { retryPaidAction } from '../paidAction'
- import { USER_ID, WALLET_MAX_RETRIES, WALLET_RETRY_TIMEOUT_MS } from '@/lib/constants'
+ import { USER_ID } from '@/lib/constants'
function paidActionType (actionType) {
switch (actionType) {
@ -8,7 +8,6 @@ function paidActionType (actionType) {
return 'ItemPaidAction'
case 'ZAP':
case 'DOWN_ZAP':
- case 'BOOST':
return 'ItemActPaidAction'
case 'TERRITORY_CREATE':
case 'TERRITORY_UPDATE':
@ -19,10 +18,6 @@ function paidActionType (actionType) {
return 'DonatePaidAction'
case 'POLL_VOTE':
return 'PollVotePaidAction'
- case 'RECEIVE':
-   return 'ReceivePaidAction'
- case 'BUY_CREDITS':
-   return 'BuyCreditsPaidAction'
default:
throw new Error('Unknown action type')
}
@ -31,12 +26,7 @@ function paidActionType (actionType) {
export default {
Query: {
paidAction: async (parent, { invoiceId }, { models, me }) => {
- const invoice = await models.invoice.findUnique({
-   where: {
-     id: invoiceId,
-     userId: me?.id ?? USER_ID.anon
-   }
- })
+ const invoice = await models.invoice.findUnique({ where: { id: invoiceId, userId: me?.id ?? USER_ID.anon } })
if (!invoice) {
throw new Error('Invoice not found')
}
@ -45,37 +35,22 @@ export default {
type: paidActionType(invoice.actionType),
invoice,
result: invoice.actionResult,
- paymentMethod: invoice.actionOptimistic ? 'OPTIMISTIC' : 'PESSIMISTIC'
+ paymentMethod: invoice.preimage ? 'PESSIMISTIC' : 'OPTIMISTIC'
}
}
},
Mutation: {
- retryPaidAction: async (parent, { invoiceId, newAttempt }, { models, me, lnd }) => {
+ retryPaidAction: async (parent, { invoiceId }, { models, me, lnd }) => {
if (!me) {
throw new Error('You must be logged in')
}
- // make sure only one client at a time can retry by acquiring a lock that expires
- const [invoice] = await models.$queryRaw`
-   UPDATE "Invoice"
-   SET "retryPendingSince" = now()
-   WHERE
-     id = ${invoiceId} AND
-     "userId" = ${me.id} AND
-     "actionState" = 'FAILED' AND
-     ("retryPendingSince" IS NULL OR "retryPendingSince" < now() - ${`${WALLET_RETRY_TIMEOUT_MS} milliseconds`}::interval)
-   RETURNING *`
+ const invoice = await models.invoice.findUnique({ where: { id: invoiceId, userId: me.id } })
if (!invoice) {
- throw new Error('Invoice not found or retry pending')
+ throw new Error('Invoice not found')
}
- // do we want to retry a payment from the beginning with all sender and receiver wallets?
- const paymentAttempt = newAttempt ? invoice.paymentAttempt + 1 : invoice.paymentAttempt
- if (paymentAttempt > WALLET_MAX_RETRIES) {
-   throw new Error('Payment has been retried too many times')
- }
- const result = await retryPaidAction(invoice.actionType, { invoice }, { paymentAttempt, models, me, lnd })
+ const result = await retryPaidAction(invoice.actionType, { invoiceId }, { models, me, lnd })
return {
...result,


@ -1,27 +1,36 @@
- import { SUPPORTED_CURRENCIES } from '@/lib/currency'
- import { cachedFetcher } from '@/lib/fetch'
- const getPrice = cachedFetcher(async function fetchPrice (fiat = 'USD') {
-   const url = `https://api.coinbase.com/v2/prices/BTC-${fiat}/spot`
-   try {
-     const res = await fetch(url)
-     const body = await res.json()
-     return parseFloat(body.data.amount)
-   } catch (err) {
-     console.error(err)
-     return -1
-   }
- }, {
-   maxSize: SUPPORTED_CURRENCIES.length,
-   cacheExpiry: 60 * 1000, // 1 minute
-   forceRefreshThreshold: 0, // never force refresh
-   keyGenerator: (fiat = 'USD') => fiat
- })
+ const cache = new Map()
+ const expiresIn = 30000 // in milliseconds
+ async function fetchPrice (fiat) {
+   const url = `https://api.coinbase.com/v2/prices/BTC-${fiat}/spot`
+   const price = await fetch(url)
+     .then((res) => res.json())
+     .then((body) => parseFloat(body.data.amount))
+     .catch((err) => {
+       console.error(err)
+       return -1
+     })
+   cache.set(fiat, { price, createdAt: Date.now() })
+   return price
+ }
+ async function getPrice (fiat) {
+   fiat ??= 'USD'
+   if (cache.has(fiat)) {
+     const { price, createdAt } = cache.get(fiat)
+     const expired = createdAt + expiresIn < Date.now()
+     if (expired) fetchPrice(fiat).catch(console.error) // update cache
+     return price // serve stale price (this on the SSR critical path)
+   } else {
+     fetchPrice(fiat).catch(console.error)
+   }
+   return null
+ }
export default {
Query: {
price: async (parent, { fiatCurrency }, ctx) => {
- return await getPrice(fiatCurrency) || -1
+ return await getPrice(fiatCurrency)
}
}
}


@ -1,12 +1,12 @@
+ import { GraphQLError } from 'graphql'
import { timeUnitForRange, whenRange } from '@/lib/time'
import { viewGroup } from './growth'
- import { GqlAuthenticationError } from '@/lib/error'
export default {
Query: {
referrals: async (parent, { when, from, to }, { models, me }) => {
if (!me) {
- throw new GqlAuthenticationError()
+ throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
}
const range = whenRange(when, from, to)


@ -1,8 +1,8 @@
- import { amountSchema, validateSchema } from '@/lib/validate'
- import { getAd, getItem } from './item'
+ import { GraphQLError } from 'graphql'
+ import { amountSchema, ssValidate } from '@/lib/validate'
+ import { getItem } from './item'
import { topUsers } from './user'
import performPaidAction from '../paidAction'
- import { GqlInputError } from '@/lib/error'
let rewardCache
@ -63,21 +63,21 @@ async function getMonthlyRewards (when, models) {
async function getRewards (when, models) {
if (when) {
if (when.length > 1) {
- throw new GqlInputError('too many dates')
+ throw new GraphQLError('too many dates', { extensions: { code: 'BAD_USER_INPUT' } })
}
when.forEach(w => {
if (isNaN(new Date(w))) {
- throw new GqlInputError('invalid date')
+ throw new GraphQLError('invalid date', { extensions: { code: 'BAD_USER_INPUT' } })
}
})
if (new Date(when[0]) > new Date(when[when.length - 1])) {
- throw new GqlInputError('bad date range')
+ throw new GraphQLError('bad date range', { extensions: { code: 'BAD_USER_INPUT' } })
}
if (new Date(when[0]).getTime() > new Date('2024-03-01').getTime() && new Date(when[0]).getTime() < new Date('2024-05-02').getTime()) {
// after 3/1/2024 and until 5/1/2024, we reward monthly on the 1st
if (new Date(when[0]).getUTCDate() !== 1) {
- throw new GqlInputError('bad reward date')
+ throw new GraphQLError('invalid reward date', { extensions: { code: 'BAD_USER_INPUT' } })
}
return await getMonthlyRewards(when, models)
@ -119,11 +119,11 @@ export default {
}
if (!when || when.length > 2) {
- throw new GqlInputError('bad date range')
+ throw new GraphQLError('invalid date range', { extensions: { code: 'BAD_USER_INPUT' } })
}
for (const w of when) {
if (isNaN(new Date(w))) {
- throw new GqlInputError('invalid date')
+ throw new GraphQLError('invalid date', { extensions: { code: 'BAD_USER_INPUT' } })
}
}
@ -141,7 +141,6 @@ export default {
(SELECT FLOOR("Earn".msats / 1000.0) as sats, type, rank, "typeId"
FROM "Earn"
WHERE "Earn"."userId" = ${me.id}
- AND (type IS NULL OR type NOT IN ('FOREVER_REFERRAL', 'ONE_DAY_REFERRAL'))
AND date_trunc('day', "Earn".created_at AT TIME ZONE 'UTC' AT TIME ZONE 'America/Chicago') = days_cte.day
ORDER BY "Earn".msats DESC)
) "Earn"
@ -157,21 +156,18 @@ export default {
const [{ to, from }] = await models.$queryRaw`
SELECT date_trunc('day', (now() AT TIME ZONE 'America/Chicago')) AT TIME ZONE 'America/Chicago' as from,
(date_trunc('day', (now() AT TIME ZONE 'America/Chicago')) AT TIME ZONE 'America/Chicago') + interval '1 day - 1 second' as to`
- return await topUsers(parent, { when: 'custom', to: new Date(to).getTime().toString(), from: new Date(from).getTime().toString(), limit: 500 }, { models, ...context })
+ return await topUsers(parent, { when: 'custom', to: new Date(to).getTime().toString(), from: new Date(from).getTime().toString(), limit: 100 }, { models, ...context })
},
total: async (parent, args, { models }) => {
if (!parent.total) {
return 0
}
return parent.total
- },
- ad: async (parent, args, { me, models }) => {
-   return await getAd(parent, { }, { me, models })
}
},
Mutation: {
donateToRewards: async (parent, { sats }, { me, models, lnd }) => {
- await validateSchema(amountSchema, { amount: sats })
+ await ssValidate(amountSchema, { amount: sats })
return await performPaidAction('DONATE', { sats }, { me, models, lnd })
}


@ -174,6 +174,7 @@ export default {
search: async (parent, { q, cursor, sort, what, when, from: whenFrom, to: whenTo }, { me, models, search }) => { search: async (parent, { q, cursor, sort, what, when, from: whenFrom, to: whenTo }, { me, models, search }) => {
const decodedCursor = decodeCursor(cursor) const decodedCursor = decodeCursor(cursor)
let sitems = null let sitems = null
let termQueries = []
// short circuit: return empty result if either: // short circuit: return empty result if either:
// 1. no query provided, or // 1. no query provided, or
@ -185,120 +186,56 @@ export default {
} }
} }
// build query in parts: const whatArr = []
// filters: determine the universe of potential search candidates
// termQueries: queries related to the actual search terms
// functions: rank modifiers to boost by recency or popularity
const filters = []
const termQueries = []
const functions = []
// filters for item types
switch (what) { switch (what) {
case 'posts': // posts only case 'posts':
filters.push({ bool: { must_not: { exists: { field: 'parentId' } } } }) whatArr.push({ bool: { must_not: { exists: { field: 'parentId' } } } })
break break
case 'comments': // comments only case 'comments':
filters.push({ bool: { must: { exists: { field: 'parentId' } } } }) whatArr.push({ bool: { must: { exists: { field: 'parentId' } } } })
break break
case 'bookmarks': case 'bookmarks':
if (me?.id) { if (me?.id) {
filters.push({ match: { bookmarkedBy: me?.id } }) whatArr.push({ match: { bookmarkedBy: me?.id } })
} }
break break
default: default:
break break
} }
// filter for active posts
filters.push(
me
? {
bool: {
should: [
{ match: { status: 'ACTIVE' } },
{ match: { status: 'NOSATS' } },
{ match: { userId: me.id } }
]
}
}
: {
bool: {
should: [
{ match: { status: 'ACTIVE' } },
{ match: { status: 'NOSATS' } }
]
}
}
)
// filter for time range
const whenRange = when === 'custom'
? {
gte: whenFrom,
lte: new Date(Math.min(new Date(Number(whenTo)), decodedCursor.time))
}
: {
lte: decodedCursor.time,
gte: whenToFrom(when)
}
filters.push({ range: { createdAt: whenRange } })
// filter for non negative wvotes
filters.push({ range: { wvotes: { gte: 0 } } })
// decompose the search terms
const { query: _query, quotes, nym, url, territory } = queryParts(q) const { query: _query, quotes, nym, url, territory } = queryParts(q)
const query = _query let query = _query
const isUrlSearch = url && query.length === 0 // exclusively searching for an url
// if search contains a url term, modify the query text
if (url) { if (url) {
const uri = url.slice(4) const isFQDN = url.startsWith('url:www.')
let uriObj const domain = isFQDN ? url.slice(8) : url.slice(4)
try { const fqdn = `www.${domain}`
uriObj = new URL(uri) query = (isUrlSearch) ? `${domain} ${fqdn}` : `${query.trim()} ${domain}`
} catch {
try {
uriObj = new URL(`https://${uri}`)
} catch {}
}
if (uriObj) {
termQueries.push({
wildcard: { url: `*${uriObj?.hostname ?? uri}${uriObj?.pathname ?? ''}*` }
})
termQueries.push({
match: { text: `${uriObj?.hostname ?? uri}${uriObj?.pathname ?? ''}` }
})
}
} }
// if nym, items must contain nym
if (nym) { if (nym) {
filters.push({ wildcard: { 'user.name': `*${nym.slice(1).toLowerCase()}*` } }) whatArr.push({ wildcard: { 'user.name': `*${nym.slice(1).toLowerCase()}*` } })
// push same requirement to termQueries to avoid empty should clause
termQueries.push({ wildcard: { 'user.name': `*${nym.slice(1).toLowerCase()}*` } })
} }
// if territory, item must be from territory
if (territory) { if (territory) {
filters.push({ match: { 'sub.name': territory.slice(1) } }) whatArr.push({ match: { 'sub.name': territory.slice(1) } })
// push same requirement to termQueries to avoid empty should clause
termQueries.push({ match: { 'sub.name': territory.slice(1) } })
} }
// if quoted phrases, items must contain entire phrase termQueries.push({
// all terms are matched in fields
multi_match: {
query,
type: 'best_fields',
fields: ['title^100', 'text'],
minimum_should_match: (isUrlSearch) ? 1 : '100%',
boost: 1000
}
})
for (const quote of quotes) { for (const quote of quotes) {
termQueries.push({ whatArr.push({
multi_match: {
query: quote,
type: 'phrase',
fields: ['title', 'text']
}
})
// force the search to include the quoted phrase
filters.push({
multi_match: { multi_match: {
query: quote, query: quote,
type: 'phrase', type: 'phrase',
@ -307,104 +244,84 @@ export default {
}) })
} }
- // functions for boosting search rank by recency or popularity
+ // if we search for an exact string only, everything must match
+ // so score purely on sort field
+ let boostMode = query ? 'multiply' : 'replace'
+ let sortField
+ let sortMod = 'log1p'
switch (sort) {
  case 'comments':
-     functions.push({
-       field_value_factor: {
-         field: 'ncomments',
-         modifier: 'log1p'
-       }
-     })
+     sortField = 'ncomments'
+     sortMod = 'square'
    break
  case 'sats':
-     functions.push({
-       field_value_factor: {
-         field: 'sats',
-         modifier: 'log1p'
-       }
-     })
+     sortField = 'sats'
    break
  case 'recent':
-     functions.push({
-       gauss: {
-         createdAt: {
-           origin: 'now',
-           scale: '7d',
-           decay: 0.5
-         }
-       }
-     })
-     break
-   case 'zaprank':
-     functions.push({
-       field_value_factor: {
-         field: 'wvotes',
-         modifier: 'log1p'
-       }
-     })
+     sortField = 'createdAt'
+     sortMod = 'square'
+     boostMode = 'replace'
    break
  default:
+     sortField = 'wvotes'
+     sortMod = 'none'
    break
}
- let osQuery = {
-   function_score: {
-     query: {
-       bool: {
-         filter: filters,
-         should: termQueries,
-         minimum_should_match: termQueries.length > 0 ? 1 : 0
-       }
-     },
-     functions,
-     score_mode: 'multiply',
-     boost_mode: 'multiply'
-   }
- }
+ const functions = [
+   {
+     field_value_factor: {
+       field: sortField,
+       modifier: sortMod,
+       factor: 1.2
+     }
+   }
+ ]
if (sort === 'recent' && !isUrlSearch) {
// prioritize exact matches
termQueries.push({
multi_match: {
query,
type: 'phrase',
fields: ['title^100', 'text'],
boost: 1000
}
})
} else {
// allow fuzzy matching with partial matches
termQueries.push({
multi_match: {
query,
type: 'most_fields',
fields: ['title^100', 'text'],
fuzziness: 'AUTO',
prefix_length: 3,
minimum_should_match: (isUrlSearch) ? 1 : '60%'
}
})
functions.push({
// small bias toward posts with comments
field_value_factor: {
field: 'ncomments',
modifier: 'ln1p',
factor: 1
}
},
{
// small bias toward recent posts
field_value_factor: {
field: 'createdAt',
modifier: 'log1p',
factor: 1
}
})
}
// query for search terms
if (query.length) {
-   // keyword based subquery, to be used on its own or in conjunction with a neural
-   // search
-   const subquery = [
-     {
-       multi_match: {
-         query,
-         type: 'best_fields',
-         fields: ['title^10', 'text'],
-         fuzziness: 'AUTO',
-         minimum_should_match: 1
-       }
-     },
-     // all match matches higher
-     {
-       multi_match: {
-         query,
-         type: 'best_fields',
-         fields: ['title^10', 'text'],
-         minimum_should_match: '100%',
-         boost: 100
-       }
-     },
-     // phrase match matches higher
-     {
-       multi_match: {
-         query,
-         type: 'phrase',
-         fields: ['title^10', 'text'],
-         boost: 1000
-       }
-     }
-   ]
-   osQuery.function_score.query.bool.should = [...termQueries, ...subquery]
-   osQuery.function_score.query.bool.minimum_should_match = 1
-   // use hybrid neural search if model id is available, otherwise use only
-   // keyword search
-   if (process.env.OPENSEARCH_MODEL_ID) {
-     osQuery = {
+   // if we have a model id and we aren't sort by recent, use neural search
+   if (process.env.OPENSEARCH_MODEL_ID && sort !== 'recent') {
+     termQueries = {
      hybrid: {
        queries: [
          {
@ -428,18 +345,32 @@ export default {
          }
        }
      }
-           ],
-           filter: filters,
-           minimum_should_match: 1
+           ]
        }
        },
-         osQuery
+         {
+           bool: {
+             should: termQueries
+           }
+         }
      ]
    }
  }
}
+ } else {
+   termQueries = []
}
const whenRange = when === 'custom'
? {
gte: whenFrom,
lte: new Date(Math.min(new Date(Number(whenTo)), decodedCursor.time))
}
: {
lte: decodedCursor.time,
gte: whenToFrom(when)
}
try {
  sitems = await search.search({
    index: process.env.OPENSEARCH_INDEX,
@ -453,7 +384,45 @@ export default {
    },
    from: decodedCursor.offset,
    body: {
-       query: osQuery,
+       query: {
function_score: {
query: {
bool: {
must: termQueries,
filter: [
...whatArr,
me
? {
bool: {
should: [
{ match: { status: 'ACTIVE' } },
{ match: { status: 'NOSATS' } },
{ match: { userId: me.id } }
]
}
}
: {
bool: {
should: [
{ match: { status: 'ACTIVE' } },
{ match: { status: 'NOSATS' } }
]
}
},
{
range:
{
createdAt: whenRange
}
},
{ range: { wvotes: { gte: 0 } } }
]
}
},
functions,
boost_mode: boostMode
}
},
      highlight: {
        fields: {
          title: { number_of_fragments: 0, pre_tags: ['***'], post_tags: ['***'] },
@ -489,7 +458,7 @@ export default {
        ${SELECT}, rank
        FROM "Item"
        JOIN r ON "Item".id = r.id`,
-     orderBy: 'ORDER BY rank ASC, msats DESC'
+     orderBy: 'ORDER BY rank ASC'
  })).map((item, i) => {
    const e = sitems.body.hits.hits[i]
    item.searchTitle = (e.highlight?.title && e.highlight.title[0]) || item.title
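For context on the scoring primitive both sides of this resolver lean on, below is a minimal, hypothetical sketch (not taken from this diff) of a function_score query with a field_value_factor boost, written against the @opensearch-project/opensearch client. The node URL, index name and match query are placeholder assumptions; per the OpenSearch docs the log1p modifier multiplies the relevance score by log10(1 + field value).

import { Client } from '@opensearch-project/opensearch'

// placeholder connection and index purely for illustration
const client = new Client({ node: 'http://localhost:9200' })
const { body } = await client.search({
  index: 'item',
  body: {
    query: {
      function_score: {
        query: { match: { title: 'bitcoin' } },
        functions: [
          // multiply the keyword relevance score by log10(1 + wvotes)
          { field_value_factor: { field: 'wvotes', modifier: 'log1p' } }
        ],
        boost_mode: 'multiply'
      }
    }
  }
})
console.log(body.hits.hits.map(h => ({ id: h._id, score: h._score })))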

api/resolvers/serial.js Normal file

@ -0,0 +1,76 @@
import { GraphQLError } from 'graphql'
import retry from 'async-retry'
import Prisma from '@prisma/client'
import { msatsToSats, numWithUnits } from '@/lib/format'
import { BALANCE_LIMIT_MSATS } from '@/lib/constants'
export default async function serialize (trx, { models, lnd }) {
// wrap first argument in array if not array already
const isArray = Array.isArray(trx)
if (!isArray) trx = [trx]
// conditional queries can be added inline using && syntax
// we filter any falsy value out here
trx = trx.filter(q => !!q)
const results = await retry(async bail => {
try {
const [, ...results] = await models.$transaction(
[models.$executeRaw`SELECT ASSERT_SERIALIZED()`, ...trx],
{ isolationLevel: Prisma.TransactionIsolationLevel.Serializable })
return results
} catch (error) {
console.log(error)
// two cases where we get insufficient funds:
// 1. plpgsql function raises
// 2. constraint violation via a prisma call
// XXX prisma does not provide a way to distinguish these cases so we
// have to check the error message
if (error.message.includes('SN_INSUFFICIENT_FUNDS') ||
error.message.includes('\\"users\\" violates check constraint \\"msats_positive\\"')) {
bail(new GraphQLError('insufficient funds', { extensions: { code: 'BAD_INPUT' } }))
}
if (error.message.includes('SN_NOT_SERIALIZABLE')) {
bail(new Error('wallet balance transaction is not serializable'))
}
if (error.message.includes('SN_CONFIRMED_WITHDRAWL_EXISTS')) {
bail(new Error('withdrawal invoice already confirmed (to withdraw again create a new invoice)'))
}
if (error.message.includes('SN_PENDING_WITHDRAWL_EXISTS')) {
bail(new Error('withdrawal invoice exists and is pending'))
}
if (error.message.includes('SN_INELIGIBLE')) {
bail(new Error('user ineligible for gift'))
}
if (error.message.includes('SN_UNSUPPORTED')) {
bail(new Error('unsupported action'))
}
if (error.message.includes('SN_DUPLICATE')) {
bail(new Error('duplicate not allowed'))
}
if (error.message.includes('SN_REVOKED_OR_EXHAUSTED')) {
bail(new Error('faucet has been revoked or is exhausted'))
}
if (error.message.includes('SN_INV_PENDING_LIMIT')) {
bail(new Error('too many pending invoices'))
}
if (error.message.includes('SN_INV_EXCEED_BALANCE')) {
bail(new Error(`pending invoices and withdrawals must not cause balance to exceed ${numWithUnits(msatsToSats(BALANCE_LIMIT_MSATS))}`))
}
if (error.message.includes('40001') || error.code === 'P2034') {
throw new Error('wallet balance serialization failure - try again')
}
if (error.message.includes('23514') || ['P2002', 'P2003', 'P2004'].includes(error.code)) {
bail(new Error('constraint failure'))
}
bail(error)
}
}, {
minTimeout: 10,
maxTimeout: 100,
retries: 10
})
// if first argument was not an array, unwrap the result
return isArray ? results : results[0]
}
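A hypothetical usage sketch of serialize as defined above (the ids and amounts are made up; models and lnd come from the resolver context): the queries run inside one serializable transaction and the helper retries automatically on serialization failures.

// move 1000 msats between two illustrative users atomically
const [sender, receiver] = await serialize([
  models.user.update({ where: { id: 21 }, data: { msats: { decrement: 1000 } } }),
  models.user.update({ where: { id: 616 }, data: { msats: { increment: 1000 } } })
], { models, lnd })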

@ -1,10 +1,10 @@
import { GraphQLError } from 'graphql'
import { whenRange } from '@/lib/time' import { whenRange } from '@/lib/time'
import { validateSchema, territorySchema } from '@/lib/validate' import { ssValidate, territorySchema } from '@/lib/validate'
import { decodeCursor, LIMIT, nextCursorEncoded } from '@/lib/cursor' import { decodeCursor, LIMIT, nextCursorEncoded } from '@/lib/cursor'
import { viewGroup } from './growth' import { viewGroup } from './growth'
import { notifyTerritoryTransfer } from '@/lib/webPush' import { notifyTerritoryTransfer } from '@/lib/webPush'
import performPaidAction from '../paidAction' import performPaidAction from '../paidAction'
import { GqlAuthenticationError, GqlInputError } from '@/lib/error'
export async function getSub (parent, { name }, { models, me }) { export async function getSub (parent, { name }, { models, me }) {
if (!name) return null if (!name) return null
@ -108,12 +108,12 @@ export default {
}, },
userSubs: async (_parent, { name, cursor, when, by, from, to, limit = LIMIT }, { models }) => { userSubs: async (_parent, { name, cursor, when, by, from, to, limit = LIMIT }, { models }) => {
if (!name) { if (!name) {
throw new GqlInputError('must supply user name') throw new GraphQLError('must supply user name', { extensions: { code: 'BAD_INPUT' } })
} }
const user = await models.user.findUnique({ where: { name } }) const user = await models.user.findUnique({ where: { name } })
if (!user) { if (!user) {
throw new GqlInputError('no user has that name') throw new GraphQLError('no user has that name', { extensions: { code: 'BAD_INPUT' } })
} }
const decodedCursor = decodeCursor(cursor) const decodedCursor = decodeCursor(cursor)
@ -154,10 +154,10 @@ export default {
Mutation: { Mutation: {
upsertSub: async (parent, { ...data }, { me, models, lnd }) => { upsertSub: async (parent, { ...data }, { me, models, lnd }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
await validateSchema(territorySchema, data, { models, me, sub: { name: data.oldName } }) await ssValidate(territorySchema, data, { models, me, sub: { name: data.oldName } })
if (data.oldName) { if (data.oldName) {
return await updateSub(parent, data, { me, models, lnd }) return await updateSub(parent, data, { me, models, lnd })
@ -174,11 +174,11 @@ export default {
}) })
if (!sub) { if (!sub) {
throw new GqlInputError('sub not found') throw new GraphQLError('sub not found', { extensions: { code: 'BAD_INPUT' } })
} }
if (sub.userId !== me.id) { if (sub.userId !== me.id) {
throw new GqlInputError('you do not own this sub') throw new GraphQLError('you do not own this sub', { extensions: { code: 'BAD_INPUT' } })
} }
if (sub.status === 'ACTIVE') { if (sub.status === 'ACTIVE') {
@ -189,7 +189,7 @@ export default {
}, },
toggleMuteSub: async (parent, { name }, { me, models }) => { toggleMuteSub: async (parent, { name }, { me, models }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
const lookupData = { userId: Number(me.id), subName: name } const lookupData = { userId: Number(me.id), subName: name }
@ -205,7 +205,7 @@ export default {
}, },
toggleSubSubscription: async (sub, { name }, { me, models }) => { toggleSubSubscription: async (sub, { name }, { me, models }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
const lookupData = { userId: me.id, subName: name } const lookupData = { userId: me.id, subName: name }
@ -221,7 +221,7 @@ export default {
}, },
transferTerritory: async (parent, { subName, userName }, { me, models }) => { transferTerritory: async (parent, { subName, userName }, { me, models }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
const sub = await models.sub.findUnique({ const sub = await models.sub.findUnique({
@ -230,18 +230,18 @@ export default {
} }
}) })
if (!sub) { if (!sub) {
throw new GqlInputError('sub not found') throw new GraphQLError('sub not found', { extensions: { code: 'BAD_INPUT' } })
} }
if (sub.userId !== me.id) { if (sub.userId !== me.id) {
throw new GqlInputError('you do not own this sub') throw new GraphQLError('you do not own this sub', { extensions: { code: 'BAD_INPUT' } })
} }
const user = await models.user.findFirst({ where: { name: userName } }) const user = await models.user.findFirst({ where: { name: userName } })
if (!user) { if (!user) {
throw new GqlInputError('user not found') throw new GraphQLError('user not found', { extensions: { code: 'BAD_INPUT' } })
} }
if (user.id === me.id) { if (user.id === me.id) {
throw new GqlInputError('cannot transfer territory to yourself') throw new GraphQLError('cannot transfer territory to yourself', { extensions: { code: 'BAD_INPUT' } })
} }
const [, updatedSub] = await models.$transaction([ const [, updatedSub] = await models.$transaction([
@ -255,25 +255,25 @@ export default {
}, },
unarchiveTerritory: async (parent, { ...data }, { me, models, lnd }) => { unarchiveTerritory: async (parent, { ...data }, { me, models, lnd }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
const { name } = data const { name } = data
await validateSchema(territorySchema, data, { models, me }) await ssValidate(territorySchema, data, { models, me, sub: { name } })
const oldSub = await models.sub.findUnique({ where: { name } }) const oldSub = await models.sub.findUnique({ where: { name } })
if (!oldSub) { if (!oldSub) {
throw new GqlInputError('sub not found') throw new GraphQLError('sub not found', { extensions: { code: 'BAD_INPUT' } })
} }
if (oldSub.status !== 'STOPPED') { if (oldSub.status !== 'STOPPED') {
throw new GqlInputError('sub is not archived') throw new GraphQLError('sub is not archived', { extensions: { code: 'BAD_INPUT' } })
} }
if (oldSub.billingType === 'ONCE') { if (oldSub.billingType === 'ONCE') {
// sanity check. this should never happen but leaving this comment here // sanity check. this should never happen but leaving this comment here
// to stop error propagation just in case and document that this should never happen. // to stop error propagation just in case and document that this should never happen.
// #defensivecode // #defensivecode
throw new GqlInputError('sub should not be archived') throw new GraphQLError('sub should not be archived', { extensions: { code: 'BAD_INPUT' } })
} }
return await performPaidAction('TERRITORY_UNARCHIVE', data, { me, models, lnd }) return await performPaidAction('TERRITORY_UNARCHIVE', data, { me, models, lnd })
@ -319,7 +319,7 @@ async function createSub (parent, data, { me, models, lnd }) {
return await performPaidAction('TERRITORY_CREATE', data, { me, models, lnd }) return await performPaidAction('TERRITORY_CREATE', data, { me, models, lnd })
} catch (error) { } catch (error) {
if (error.code === 'P2002') { if (error.code === 'P2002') {
throw new GqlInputError('name taken') throw new GraphQLError('name taken', { extensions: { code: 'BAD_INPUT' } })
} }
throw error throw error
} }
@ -339,14 +339,14 @@ async function updateSub (parent, { oldName, ...data }, { me, models, lnd }) {
}) })
if (!oldSub) { if (!oldSub) {
throw new GqlInputError('sub not found') throw new GraphQLError('sub not found', { extensions: { code: 'BAD_INPUT' } })
} }
try { try {
return await performPaidAction('TERRITORY_UPDATE', { oldName, ...data }, { me, models, lnd }) return await performPaidAction('TERRITORY_UPDATE', { oldName, ...data }, { me, models, lnd })
} catch (error) { } catch (error) {
if (error.code === 'P2002') { if (error.code === 'P2002') {
throw new GqlInputError('name taken') throw new GraphQLError('name taken', { extensions: { code: 'BAD_INPUT' } })
} }
throw error throw error
} }

@ -1,40 +1,27 @@
import { USER_ID, IMAGE_PIXELS_MAX, UPLOAD_SIZE_MAX, UPLOAD_SIZE_MAX_AVATAR, UPLOAD_TYPES_ALLOW, AWS_S3_URL_REGEXP, AVATAR_TYPES_ALLOW } from '@/lib/constants' import { GraphQLError } from 'graphql'
import { USER_ID, IMAGE_PIXELS_MAX, UPLOAD_SIZE_MAX, UPLOAD_SIZE_MAX_AVATAR, UPLOAD_TYPES_ALLOW } from '@/lib/constants'
import { createPresignedPost } from '@/api/s3' import { createPresignedPost } from '@/api/s3'
import { GqlAuthenticationError, GqlInputError } from '@/lib/error'
import { msatsToSats } from '@/lib/format'
export default { export default {
Query: {
uploadFees: async (parent, { s3Keys }, { models, me }) => {
return uploadFees(s3Keys, { models, me })
}
},
Mutation: { Mutation: {
getSignedPOST: async (parent, { type, size, width, height, avatar }, { models, me }) => { getSignedPOST: async (parent, { type, size, width, height, avatar }, { models, me }) => {
if (UPLOAD_TYPES_ALLOW.indexOf(type) === -1) { if (UPLOAD_TYPES_ALLOW.indexOf(type) === -1) {
throw new GqlInputError(`upload must be ${UPLOAD_TYPES_ALLOW.map(t => t.replace(/^(image|video)\//, '')).join(', ')}`) throw new GraphQLError(`image must be ${UPLOAD_TYPES_ALLOW.map(t => t.replace('image/', '')).join(', ')}`, { extensions: { code: 'BAD_INPUT' } })
} }
if (size > UPLOAD_SIZE_MAX) { if (size > UPLOAD_SIZE_MAX) {
throw new GqlInputError(`upload must be less than ${UPLOAD_SIZE_MAX / (1024 ** 2)} megabytes`) throw new GraphQLError(`image must be less than ${UPLOAD_SIZE_MAX / (1024 ** 2)} megabytes`, { extensions: { code: 'BAD_INPUT' } })
} }
if (avatar) { if (avatar && size > UPLOAD_SIZE_MAX_AVATAR) {
if (AVATAR_TYPES_ALLOW.indexOf(type) === -1) { throw new GraphQLError(`image must be less than ${UPLOAD_SIZE_MAX_AVATAR / (1024 ** 2)} megabytes`, { extensions: { code: 'BAD_INPUT' } })
throw new GqlInputError(`avatar must be ${AVATAR_TYPES_ALLOW.map(t => t.replace('image/', '')).join(', ')}`)
}
if (size > UPLOAD_SIZE_MAX_AVATAR) {
throw new GqlInputError(`avatar must be less than ${UPLOAD_SIZE_MAX_AVATAR / (1024 ** 2)} megabytes`)
}
} }
// width and height is 0 for videos
if (width * height > IMAGE_PIXELS_MAX) { if (width * height > IMAGE_PIXELS_MAX) {
throw new GqlInputError(`image must be less than ${IMAGE_PIXELS_MAX} pixels`) throw new GraphQLError(`image must be less than ${IMAGE_PIXELS_MAX} pixels`, { extensions: { code: 'BAD_INPUT' } })
} }
const fileParams = { const imgParams = {
type, type,
size, size,
width, width,
@ -44,27 +31,12 @@ export default {
} }
if (avatar) { if (avatar) {
if (!me) throw new GqlAuthenticationError() if (!me) throw new GraphQLError('you must be logged in', { extensions: { code: 'FORBIDDEN' } })
fileParams.paid = undefined imgParams.paid = undefined
} }
const upload = await models.upload.create({ data: { ...fileParams } }) const upload = await models.upload.create({ data: { ...imgParams } })
return createPresignedPost({ key: String(upload.id), type, size }) return createPresignedPost({ key: String(upload.id), type, size })
} }
} }
} }
export function uploadIdsFromText (text, { models }) {
if (!text) return []
return [...new Set([...text.matchAll(AWS_S3_URL_REGEXP)].map(m => Number(m[1])))]
}
export async function uploadFees (s3Keys, { models, me }) {
// returns info object in this format:
// { bytes24h: int, bytesUnpaid: int, nUnpaid: int, uploadFeesMsats: BigInt }
const [info] = await models.$queryRawUnsafe('SELECT * FROM upload_fees($1::INTEGER, $2::INTEGER[])', me ? me.id : USER_ID.anon, s3Keys)
const uploadFees = msatsToSats(info.uploadFeesMsats)
const totalFeesMsats = info.nUnpaid * Number(info.uploadFeesMsats)
const totalFees = msatsToSats(totalFeesMsats)
return { ...info, uploadFees, totalFees, totalFeesMsats }
}
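A worked example of the fee arithmetic above, using made-up numbers for what upload_fees might return:

// if upload_fees() reports nUnpaid = 3 and uploadFeesMsats = 10000
// uploadFees     = msatsToSats(10000)  -> 10 sats per upload
// totalFeesMsats = 3 * 10000           -> 30000 msats
// totalFees      = msatsToSats(30000)  -> 30 sats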

@ -1,16 +1,16 @@
import { readFile } from 'fs/promises' import { readFile } from 'fs/promises'
import { join, resolve } from 'path' import { join, resolve } from 'path'
import { GraphQLError } from 'graphql'
import { decodeCursor, LIMIT, nextCursorEncoded } from '@/lib/cursor' import { decodeCursor, LIMIT, nextCursorEncoded } from '@/lib/cursor'
import { msatsToSats } from '@/lib/format' import { msatsToSats } from '@/lib/format'
import { bioSchema, emailSchema, settingsSchema, validateSchema, userSchema } from '@/lib/validate' import { bioSchema, emailSchema, settingsSchema, ssValidate, userSchema } from '@/lib/validate'
import { getItem, updateItem, filterClause, createItem, whereClause, muteClause, activeOrMine } from './item' import { getItem, updateItem, filterClause, createItem, whereClause, muteClause, activeOrMine } from './item'
import { USER_ID, RESERVED_MAX_USER_ID, SN_NO_REWARDS_IDS, INVOICE_ACTION_NOTIFICATION_TYPES, WALLET_MAX_RETRIES, WALLET_RETRY_BEFORE_MS } from '@/lib/constants' import { USER_ID, RESERVED_MAX_USER_ID, SN_NO_REWARDS_IDS, INVOICE_ACTION_NOTIFICATION_TYPES } from '@/lib/constants'
import { viewGroup } from './growth' import { viewGroup } from './growth'
import { datePivot, timeUnitForRange, whenRange } from '@/lib/time' import { timeUnitForRange, whenRange } from '@/lib/time'
import assertApiKeyNotPermitted from './apiKey' import assertApiKeyNotPermitted from './apiKey'
import { hashEmail } from '@/lib/crypto' import { hashEmail } from '@/lib/crypto'
import { isMuted } from '@/lib/user' import { isMuted } from '@/lib/user'
import { GqlAuthenticationError, GqlAuthorizationError, GqlInputError } from '@/lib/error'
const contributors = new Set() const contributors = new Set()
@ -66,12 +66,11 @@ export async function topUsers (parent, { cursor, when, by, from, to, limit = LI
case 'comments': column = 'ncomments'; break case 'comments': column = 'ncomments'; break
case 'referrals': column = 'referrals'; break case 'referrals': column = 'referrals'; break
case 'stacking': column = 'stacked'; break case 'stacking': column = 'stacked'; break
case 'value':
default: column = 'proportion'; break default: column = 'proportion'; break
} }
const users = (await models.$queryRawUnsafe(` const users = (await models.$queryRawUnsafe(`
SELECT * ${column === 'proportion' ? ', proportion' : ''} SELECT *
FROM FROM
(SELECT users.*, (SELECT users.*,
COALESCE(floor(sum(msats_spent)/1000), 0) as spent, COALESCE(floor(sum(msats_spent)/1000), 0) as spent,
@ -126,14 +125,13 @@ export default {
}, },
settings: async (parent, args, { models, me }) => { settings: async (parent, args, { models, me }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
return await models.user.findUnique({ where: { id: me.id } }) return await models.user.findUnique({ where: { id: me.id } })
}, },
user: async (parent, { id, name }, { models }) => { user: async (parent, { name }, { models }) => {
if (id) id = Number(id) return await models.user.findUnique({ where: { name } })
return await models.user.findUnique({ where: { id, name } })
}, },
users: async (parent, args, { models }) => users: async (parent, args, { models }) =>
await models.user.findMany(), await models.user.findMany(),
@ -146,7 +144,7 @@ export default {
}, },
mySubscribedUsers: async (parent, { cursor }, { models, me }) => { mySubscribedUsers: async (parent, { cursor }, { models, me }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('You must be logged in to view subscribed users', { extensions: { code: 'UNAUTHENTICATED' } })
} }
const decodedCursor = decodeCursor(cursor) const decodedCursor = decodeCursor(cursor)
@ -167,7 +165,7 @@ export default {
}, },
myMutedUsers: async (parent, { cursor }, { models, me }) => { myMutedUsers: async (parent, { cursor }, { models, me }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('You must be logged in to view muted users', { extensions: { code: 'UNAUTHENTICATED' } })
} }
const decodedCursor = decodeCursor(cursor) const decodedCursor = decodeCursor(cursor)
@ -285,7 +283,6 @@ export default {
'"ThreadSubscription"."userId" = $1', '"ThreadSubscription"."userId" = $1',
'r.created_at > $2', 'r.created_at > $2',
'r.created_at >= "ThreadSubscription".created_at', 'r.created_at >= "ThreadSubscription".created_at',
'r."userId" <> $1',
activeOrMine(me), activeOrMine(me),
await filterClause(me, models), await filterClause(me, models),
muteClause(me), muteClause(me),
@ -397,6 +394,22 @@ export default {
} }
} }
const job = await models.item.findFirst({
where: {
maxBid: {
not: null
},
userId: me.id,
statusUpdatedAt: {
gt: lastChecked
}
}
})
if (job && job.statusUpdatedAt > job.createdAt) {
foundNotes()
return true
}
if (user.noteEarning) { if (user.noteEarning) {
const earn = await models.earn.findFirst({ const earn = await models.earn.findFirst({
where: { where: {
@ -422,16 +435,8 @@ export default {
confirmedAt: { confirmedAt: {
gt: lastChecked gt: lastChecked
}, },
OR: [ isHeld: null,
{ actionType: null
isHeld: null,
actionType: null
},
{
actionType: 'RECEIVE',
actionState: 'PAID'
}
]
} }
}) })
if (invoice) { if (invoice) {
@ -445,13 +450,9 @@ export default {
where: { where: {
userId: me.id, userId: me.id,
status: 'CONFIRMED', status: 'CONFIRMED',
hash: {
not: null
},
updatedAt: { updatedAt: {
gt: lastChecked gt: lastChecked
}, }
invoiceForward: { is: null }
} }
}) })
if (wdrwl) { if (wdrwl) {
@ -543,17 +544,7 @@ export default {
actionType: { actionType: {
in: INVOICE_ACTION_NOTIFICATION_TYPES in: INVOICE_ACTION_NOTIFICATION_TYPES
}, },
actionState: 'FAILED', actionState: 'FAILED'
OR: [
{
paymentAttempt: {
gte: WALLET_MAX_RETRIES
}
},
{
userCancel: true
}
]
} }
}) })
@ -562,31 +553,6 @@ export default {
return true return true
} }
const invoiceActionFailed2 = await models.invoice.findFirst({
where: {
userId: me.id,
updatedAt: {
gt: datePivot(lastChecked, { milliseconds: -WALLET_RETRY_BEFORE_MS })
},
actionType: {
in: INVOICE_ACTION_NOTIFICATION_TYPES
},
actionState: 'FAILED',
paymentAttempt: {
lt: WALLET_MAX_RETRIES
},
userCancel: false,
cancelledAt: {
lte: datePivot(new Date(), { milliseconds: -WALLET_RETRY_BEFORE_MS })
}
}
})
if (invoiceActionFailed2) {
foundNotes()
return true
}
// update checkedNotesAt to prevent rechecking same time period // update checkedNotesAt to prevent rechecking same time period
models.user.update({ models.user.update({
where: { id: me.id }, where: { id: me.id },
@ -655,49 +621,29 @@ export default {
}, },
Mutation: { Mutation: {
disableFreebies: async (parent, args, { me, models }) => {
if (!me) {
throw new GqlAuthenticationError()
}
// disable freebies if it hasn't been set yet
try {
await models.user.update({
where: { id: me.id, disableFreebies: null },
data: { disableFreebies: true }
})
} catch (err) {
// ignore 'record not found' errors
if (err.code !== 'P2025') {
throw err
}
}
return true
},
setName: async (parent, data, { me, models }) => { setName: async (parent, data, { me, models }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
await validateSchema(userSchema, data, { models }) await ssValidate(userSchema, data, { models })
try { try {
await models.user.update({ where: { id: me.id }, data }) await models.user.update({ where: { id: me.id }, data })
return data.name return data.name
} catch (error) { } catch (error) {
if (error.code === 'P2002') { if (error.code === 'P2002') {
throw new GqlInputError('name taken') throw new GraphQLError('name taken', { extensions: { code: 'BAD_INPUT' } })
} }
throw error throw error
} }
}, },
setSettings: async (parent, { settings: { nostrRelays, ...data } }, { me, models }) => { setSettings: async (parent, { settings: { nostrRelays, ...data } }, { me, models }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
await validateSchema(settingsSchema, { nostrRelays, ...data }) await ssValidate(settingsSchema, { nostrRelays, ...data })
if (nostrRelays?.length) { if (nostrRelays?.length) {
const connectOrCreate = [] const connectOrCreate = []
@ -720,7 +666,7 @@ export default {
}, },
setWalkthrough: async (parent, { upvotePopover, tipPopover }, { me, models }) => { setWalkthrough: async (parent, { upvotePopover, tipPopover }, { me, models }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
await models.user.update({ where: { id: me.id }, data: { upvotePopover, tipPopover } }) await models.user.update({ where: { id: me.id }, data: { upvotePopover, tipPopover } })
@ -729,7 +675,7 @@ export default {
}, },
setPhoto: async (parent, { photoId }, { me, models }) => { setPhoto: async (parent, { photoId }, { me, models }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
await models.user.update({ await models.user.update({
@ -739,29 +685,31 @@ export default {
return Number(photoId) return Number(photoId)
}, },
upsertBio: async (parent, { text }, { me, models, lnd }) => { upsertBio: async (parent, { bio }, { me, models }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
await validateSchema(bioSchema, { text }) await ssValidate(bioSchema, { bio })
const user = await models.user.findUnique({ where: { id: me.id } }) const user = await models.user.findUnique({ where: { id: me.id } })
if (user.bioId) { if (user.bioId) {
return await updateItem(parent, { id: user.bioId, bio: true, text, title: `@${user.name}'s bio` }, { me, models, lnd }) await updateItem(parent, { id: user.bioId, text: bio, title: `@${user.name}'s bio` }, { me, models })
} else { } else {
return await createItem(parent, { bio: true, text, title: `@${user.name}'s bio` }, { me, models, lnd }) await createItem(parent, { bio: true, text: bio, title: `@${user.name}'s bio` }, { me, models })
} }
return await models.user.findUnique({ where: { id: me.id } })
}, },
generateApiKey: async (parent, { id }, { models, me }) => { generateApiKey: async (parent, { id }, { models, me }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
const user = await models.user.findUnique({ where: { id: me.id } }) const user = await models.user.findUnique({ where: { id: me.id } })
if (!user.apiKeyEnabled) { if (!user.apiKeyEnabled) {
throw new GqlAuthorizationError('you are not allowed to generate api keys') throw new GraphQLError('you are not allowed to generate api keys', { extensions: { code: 'FORBIDDEN' } })
} }
// I trust postgres CSPRNG more than the one from JS // I trust postgres CSPRNG more than the one from JS
@ -776,14 +724,14 @@ export default {
}, },
deleteApiKey: async (parent, { id }, { models, me }) => { deleteApiKey: async (parent, { id }, { models, me }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
return await models.user.update({ where: { id: me.id }, data: { apiKeyHash: null } }) return await models.user.update({ where: { id: me.id }, data: { apiKeyHash: null } })
}, },
unlinkAuth: async (parent, { authType }, { models, me }) => { unlinkAuth: async (parent, { authType }, { models, me }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
assertApiKeyNotPermitted({ me }) assertApiKeyNotPermitted({ me })
@ -792,7 +740,7 @@ export default {
user = await models.user.findUnique({ where: { id: me.id } }) user = await models.user.findUnique({ where: { id: me.id } })
const account = await models.account.findFirst({ where: { userId: me.id, provider: authType } }) const account = await models.account.findFirst({ where: { userId: me.id, provider: authType } })
if (!account) { if (!account) {
throw new GqlInputError('no such account') throw new GraphQLError('no such account', { extensions: { code: 'BAD_INPUT' } })
} }
await models.account.delete({ where: { id: account.id } }) await models.account.delete({ where: { id: account.id } })
if (authType === 'twitter') { if (authType === 'twitter') {
@ -807,18 +755,18 @@ export default {
} else if (authType === 'email') { } else if (authType === 'email') {
user = await models.user.update({ where: { id: me.id }, data: { email: null, emailVerified: null, emailHash: null } }) user = await models.user.update({ where: { id: me.id }, data: { email: null, emailVerified: null, emailHash: null } })
} else { } else {
throw new GqlInputError('no such account') throw new GraphQLError('no such account', { extensions: { code: 'BAD_INPUT' } })
} }
return await authMethods(user, undefined, { models, me }) return await authMethods(user, undefined, { models, me })
}, },
linkUnverifiedEmail: async (parent, { email }, { models, me }) => { linkUnverifiedEmail: async (parent, { email }, { models, me }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
assertApiKeyNotPermitted({ me }) assertApiKeyNotPermitted({ me })
await validateSchema(emailSchema, { email }) await ssValidate(emailSchema, { email })
try { try {
await models.user.update({ await models.user.update({
@ -827,7 +775,7 @@ export default {
}) })
} catch (error) { } catch (error) {
if (error.code === 'P2002') { if (error.code === 'P2002') {
throw new GqlInputError('email taken') throw new GraphQLError('email taken', { extensions: { code: 'BAD_INPUT' } })
} }
throw error throw error
} }
@ -840,12 +788,12 @@ export default {
const muted = await isMuted({ models, muterId: me?.id, mutedId: id }) const muted = await isMuted({ models, muterId: me?.id, mutedId: id })
if (existing) { if (existing) {
if (muted && !existing.postsSubscribedAt) { if (muted && !existing.postsSubscribedAt) {
throw new GqlInputError("you can't subscribe to a stacker that you've muted") throw new GraphQLError("you can't subscribe to a stacker that you've muted", { extensions: { code: 'BAD_INPUT' } })
} }
await models.userSubscription.update({ where: { followerId_followeeId: lookupData }, data: { postsSubscribedAt: existing.postsSubscribedAt ? null : new Date() } }) await models.userSubscription.update({ where: { followerId_followeeId: lookupData }, data: { postsSubscribedAt: existing.postsSubscribedAt ? null : new Date() } })
} else { } else {
if (muted) { if (muted) {
throw new GqlInputError("you can't subscribe to a stacker that you've muted") throw new GraphQLError("you can't subscribe to a stacker that you've muted", { extensions: { code: 'BAD_INPUT' } })
} }
await models.userSubscription.create({ data: { ...lookupData, postsSubscribedAt: new Date() } }) await models.userSubscription.create({ data: { ...lookupData, postsSubscribedAt: new Date() } })
} }
@ -857,12 +805,12 @@ export default {
const muted = await isMuted({ models, muterId: me?.id, mutedId: id }) const muted = await isMuted({ models, muterId: me?.id, mutedId: id })
if (existing) { if (existing) {
if (muted && !existing.commentsSubscribedAt) { if (muted && !existing.commentsSubscribedAt) {
throw new GqlInputError("you can't subscribe to a stacker that you've muted") throw new GraphQLError("you can't subscribe to a stacker that you've muted", { extensions: { code: 'BAD_INPUT' } })
} }
await models.userSubscription.update({ where: { followerId_followeeId: lookupData }, data: { commentsSubscribedAt: existing.commentsSubscribedAt ? null : new Date() } }) await models.userSubscription.update({ where: { followerId_followeeId: lookupData }, data: { commentsSubscribedAt: existing.commentsSubscribedAt ? null : new Date() } })
} else { } else {
if (muted) { if (muted) {
throw new GqlInputError("you can't subscribe to a stacker that you've muted") throw new GraphQLError("you can't subscribe to a stacker that you've muted", { extensions: { code: 'BAD_INPUT' } })
} }
await models.userSubscription.create({ data: { ...lookupData, commentsSubscribedAt: new Date() } }) await models.userSubscription.create({ data: { ...lookupData, commentsSubscribedAt: new Date() } })
} }
@ -885,7 +833,7 @@ export default {
} }
}) })
if (subscription?.postsSubscribedAt || subscription?.commentsSubscribedAt) { if (subscription?.postsSubscribedAt || subscription?.commentsSubscribedAt) {
throw new GqlInputError("you can't mute a stacker to whom you've subscribed") throw new GraphQLError("you can't mute a stacker to whom you've subscribed", { extensions: { code: 'BAD_INPUT' } })
} }
await models.mute.create({ data: { ...lookupData } }) await models.mute.create({ data: { ...lookupData } })
} }
@ -893,7 +841,7 @@ export default {
}, },
hideWelcomeBanner: async (parent, data, { me, models }) => { hideWelcomeBanner: async (parent, data, { me, models }) => {
if (!me) { if (!me) {
throw new GqlAuthenticationError() throw new GraphQLError('you must be logged in', { extensions: { code: 'UNAUTHENTICATED' } })
} }
await models.user.update({ where: { id: me.id }, data: { hideWelcomeBanner: true } }) await models.user.update({ where: { id: me.id }, data: { hideWelcomeBanner: true } })
@ -950,8 +898,7 @@ export default {
// get the user's first item // get the user's first item
const item = await models.item.findFirst({ const item = await models.item.findFirst({
where: { where: {
userId: user.id, userId: user.id
OR: [{ invoiceActionState: 'PAID' }, { invoiceActionState: null }]
}, },
orderBy: { orderBy: {
createdAt: 'asc' createdAt: 'asc'
@ -971,8 +918,7 @@ export default {
createdAt: { createdAt: {
gte, gte,
lte lte
}, }
OR: [{ invoiceActionState: 'PAID' }, { invoiceActionState: null }]
} }
}) })
}, },
@ -989,8 +935,7 @@ export default {
createdAt: { createdAt: {
gte, gte,
lte lte
}, }
OR: [{ invoiceActionState: 'PAID' }, { invoiceActionState: null }]
} }
}) })
}, },
@ -1007,8 +952,7 @@ export default {
createdAt: { createdAt: {
gte, gte,
lte lte
}, }
OR: [{ invoiceActionState: 'PAID' }, { invoiceActionState: null }]
} }
}) })
}, },
@ -1039,13 +983,7 @@ export default {
if (!me || me.id !== user.id) { if (!me || me.id !== user.id) {
return 0 return 0
} }
return msatsToSats(user.msats + user.mcredits) return msatsToSats(user.msats)
},
credits: async (user, args, { models, me }) => {
if (!me || me.id !== user.id) {
return 0
}
return msatsToSats(user.mcredits)
}, },
authMethods, authMethods,
hasInvites: async (user, args, { models }) => { hasInvites: async (user, args, { models }) => {
@ -1065,12 +1003,6 @@ export default {
}) })
return relays?.map(r => r.nostrRelayAddr) return relays?.map(r => r.nostrRelayAddr)
},
tipRandom: async (user, args, { me }) => {
if (!me || me.id !== user.id) {
return false
}
return !!user.tipRandomMin && !!user.tipRandomMax
} }
}, },
@ -1082,20 +1014,6 @@ export default {
return user.streak return user.streak
}, },
gunStreak: async (user, args, { models }) => {
if (user.hideCowboyHat) {
return null
}
return user.gunStreak
},
horseStreak: async (user, args, { models }) => {
if (user.hideCowboyHat) {
return null
}
return user.horseStreak
},
maxStreak: async (user, args, { models }) => { maxStreak: async (user, args, { models }) => {
if (user.hideCowboyHat) { if (user.hideCowboyHat) {
return null return null
@ -1127,7 +1045,7 @@ export default {
if (!when || when === 'forever') { if (!when || when === 'forever') {
// forever // forever
return ((user.stackedMsats && msatsToSats(user.stackedMsats)) || 0) return (user.stackedMsats && msatsToSats(user.stackedMsats)) || 0
} }
const range = whenRange(when, from, to) const range = whenRange(when, from, to)

@ -1,75 +0,0 @@
import { E_VAULT_KEY_EXISTS, GqlAuthenticationError, GqlInputError } from '@/lib/error'
export default {
Query: {
getVaultEntry: async (parent, { key }, { me, models }, info) => {
if (!me) throw new GqlAuthenticationError()
if (!key) throw new GqlInputError('must have key')
const k = await models.vault.findUnique({
where: {
key,
userId: me.id
}
})
return k
},
getVaultEntries: async (parent, { keysFilter }, { me, models }, info) => {
if (!me) throw new GqlAuthenticationError()
const entries = await models.vaultEntry.findMany({
where: {
userId: me.id,
key: keysFilter?.length
? {
in: keysFilter
}
: undefined
}
})
return entries
}
},
Mutation: {
// atomic vault migration
updateVaultKey: async (parent, { entries, hash }, { me, models }) => {
if (!me) throw new GqlAuthenticationError()
if (!hash) throw new GqlInputError('hash required')
const txs = []
const { vaultKeyHash: oldKeyHash } = await models.user.findUnique({ where: { id: me.id } })
if (oldKeyHash) {
if (oldKeyHash !== hash) {
throw new GqlInputError('vault key already set', E_VAULT_KEY_EXISTS)
} else {
return true
}
} else {
txs.push(models.user.update({
where: { id: me.id },
data: { vaultKeyHash: hash }
}))
}
for (const entry of entries) {
txs.push(models.vaultEntry.update({
where: { userId_key: { userId: me.id, key: entry.key } },
data: { value: entry.value, iv: entry.iv }
}))
}
await models.$transaction(txs)
return true
},
clearVault: async (parent, args, { me, models }) => {
if (!me) throw new GqlAuthenticationError()
const txs = []
txs.push(models.user.update({
where: { id: me.id },
data: { vaultKeyHash: '' }
}))
txs.push(models.vaultEntry.deleteMany({ where: { userId: me.id } }))
await models.$transaction(txs)
return true
}
}
}

File diff suppressed because it is too large

@ -13,9 +13,6 @@ import { BLOCK_HEIGHT } from '@/fragments/blockHeight'
import { CHAIN_FEE } from '@/fragments/chainFee' import { CHAIN_FEE } from '@/fragments/chainFee'
import { getServerSession } from 'next-auth/next' import { getServerSession } from 'next-auth/next'
import { getAuthOptions } from '@/pages/api/auth/[...nextauth]' import { getAuthOptions } from '@/pages/api/auth/[...nextauth]'
import { NOFOLLOW_LIMIT } from '@/lib/constants'
import { satsToMsats } from '@/lib/format'
import { MULTI_AUTH_ANON, MULTI_AUTH_LIST } from '@/lib/auth'
export default async function getSSRApolloClient ({ req, res, me = null }) { export default async function getSSRApolloClient ({ req, res, me = null }) {
const session = req && await getServerSession(req, res, getAuthOptions(req)) const session = req && await getServerSession(req, res, getAuthOptions(req))
@ -43,17 +40,17 @@ export default async function getSSRApolloClient ({ req, res, me = null }) {
watchQuery: { watchQuery: {
fetchPolicy: 'no-cache', fetchPolicy: 'no-cache',
nextFetchPolicy: 'no-cache', nextFetchPolicy: 'no-cache',
canonizeResults: true,
ssr: true ssr: true
}, },
query: { query: {
fetchPolicy: 'no-cache', fetchPolicy: 'no-cache',
nextFetchPolicy: 'no-cache', nextFetchPolicy: 'no-cache',
canonizeResults: true,
ssr: true ssr: true
} }
} }
}) })
await client.clearStore()
return client return client
} }
@ -67,17 +64,7 @@ function oneDayReferral (request, { me }) {
let prismaPromise, getData let prismaPromise, getData
if (referrer.startsWith('item-')) { if (referrer.startsWith('item-')) {
prismaPromise = models.item.findUnique({ prismaPromise = models.item.findUnique({ where: { id: parseInt(referrer.slice(5)) } })
where: {
id: parseInt(referrer.slice(5)),
msats: {
gt: satsToMsats(NOFOLLOW_LIMIT)
},
weightedVotes: {
gt: 0
}
}
})
getData = item => ({ getData = item => ({
referrerId: item.userId, referrerId: item.userId,
refereeId: parseInt(me.id), refereeId: parseInt(me.id),
@ -152,20 +139,10 @@ export function getGetServerSideProps (
const client = await getSSRApolloClient({ req, res }) const client = await getSSRApolloClient({ req, res })
let { data: { me } } = await client.query({ query: ME }) const { data: { me } } = await client.query({ query: ME })
// required to redirect to /signup on page reload
// if we switched to anon and authentication is required
if (req.cookies[MULTI_AUTH_LIST] === MULTI_AUTH_ANON) {
me = null
}
if (authRequired && !me) { if (authRequired && !me) {
let callback = process.env.NEXT_PUBLIC_URL + req.url const callback = process.env.NEXT_PUBLIC_URL + req.url
// On client-side routing, the callback is a NextJS URL
// so we need to remove the NextJS stuff.
// Example: /_next/data/development/territory.json
callback = callback.replace(/\/_next\/data\/\w+\//, '/').replace(/\.json$/, '')
return { return {
redirect: { redirect: {
destination: `/signup?callbackUrl=${encodeURIComponent(callback)}` destination: `/signup?callbackUrl=${encodeURIComponent(callback)}`
@ -197,7 +174,6 @@ export function getGetServerSideProps (
} }
if (error || !data || (notFound && notFound(data, vars, me))) { if (error || !data || (notFound && notFound(data, vars, me))) {
error && console.error(error)
res.writeHead(302, { res.writeHead(302, {
Location: '/404' Location: '/404'
}).end() }).end()

@ -13,8 +13,6 @@ export default gql`
spenderGrowth(when: String, from: String, to: String): [TimeData!]! spenderGrowth(when: String, from: String, to: String): [TimeData!]!
stackingGrowth(when: String, from: String, to: String): [TimeData!]! stackingGrowth(when: String, from: String, to: String): [TimeData!]!
stackerGrowth(when: String, from: String, to: String): [TimeData!]! stackerGrowth(when: String, from: String, to: String): [TimeData!]!
itemGrowthSubs(when: String, from: String, to: String, sub: String): [TimeData!]!
revenueGrowthSubs(when: String, from: String, to: String, sub: String): [TimeData!]!
} }
type TimeData { type TimeData {

api/typeDefs/image.js Normal file

@ -0,0 +1,16 @@
import { gql } from 'graphql-tag'
export default gql`
type ImageFeesInfo {
totalFees: Int!
totalFeesMsats: Int!
imageFee: Int!
imageFeeMsats: Int!
nUnpaid: Int!
bytesUnpaid: Int!
bytes24h: Int!
}
extend type Query {
imageFeesInfo(s3Keys: [Int]!): ImageFeesInfo!
}
`

@ -17,8 +17,8 @@ import price from './price'
import admin from './admin' import admin from './admin'
import blockHeight from './blockHeight' import blockHeight from './blockHeight'
import chainFee from './chainFee' import chainFee from './chainFee'
import image from './image'
import paidAction from './paidAction' import paidAction from './paidAction'
import vault from './vault'
const common = gql` const common = gql`
type Query { type Query {
@ -39,4 +39,4 @@ const common = gql`
` `
export default [common, user, item, itemForward, message, wallet, lnurl, notifications, invite, export default [common, user, item, itemForward, message, wallet, lnurl, notifications, invite,
sub, upload, growth, rewards, referrals, price, admin, blockHeight, chainFee, paidAction, vault] sub, upload, growth, rewards, referrals, price, admin, blockHeight, chainFee, image, paidAction]

@ -7,7 +7,7 @@ export default gql`
} }
extend type Mutation { extend type Mutation {
createInvite(id: String, gift: Int!, limit: Int!, description: String): Invite createInvite(gift: Int!, limit: Int): Invite
revokeInvite(id: ID!): Invite revokeInvite(id: ID!): Invite
} }
@ -20,6 +20,5 @@ export default gql`
user: User! user: User!
revoked: Boolean! revoked: Boolean!
poor: Boolean! poor: Boolean!
description: String
} }
` `

@ -8,18 +8,10 @@ export default gql`
dupes(url: String!): [Item!] dupes(url: String!): [Item!]
related(cursor: String, title: String, id: ID, minMatch: String, limit: Limit): Items related(cursor: String, title: String, id: ID, minMatch: String, limit: Limit): Items
search(q: String, sub: String, cursor: String, what: String, sort: String, when: String, from: String, to: String): Items search(q: String, sub: String, cursor: String, what: String, sort: String, when: String, from: String, to: String): Items
auctionPosition(sub: String, id: ID, boost: Int): Int! auctionPosition(sub: String, id: ID, bid: Int!): Int!
boostPosition(sub: String, id: ID, boost: Int): BoostPositions!
itemRepetition(parentId: ID): Int! itemRepetition(parentId: ID): Int!
} }
type BoostPositions {
home: Boolean!
sub: Boolean!
homeMaxBoost: Int!
subMaxBoost: Int!
}
type TitleUnshorted { type TitleUnshorted {
title: String title: String
unshorted: String unshorted: String
@ -43,24 +35,15 @@ export default gql`
pinItem(id: ID): Item pinItem(id: ID): Item
subscribeItem(id: ID): Item subscribeItem(id: ID): Item
deleteItem(id: ID): Item deleteItem(id: ID): Item
upsertLink( upsertLink(id: ID, sub: String, title: String!, url: String!, text: String, boost: Int, forward: [ItemForwardInput]): ItemPaidAction!
id: ID, sub: String, title: String!, url: String!, text: String, boost: Int, forward: [ItemForwardInput], upsertDiscussion(id: ID, sub: String, title: String!, text: String, boost: Int, forward: [ItemForwardInput]): ItemPaidAction!
hash: String, hmac: String): ItemPaidAction! upsertBounty(id: ID, sub: String, title: String!, text: String, bounty: Int, boost: Int, forward: [ItemForwardInput]): ItemPaidAction!
upsertDiscussion( upsertJob(id: ID, sub: String!, title: String!, company: String!, location: String, remote: Boolean,
id: ID, sub: String, title: String!, text: String, boost: Int, forward: [ItemForwardInput], text: String!, url: String!, maxBid: Int!, status: String, logo: Int): ItemPaidAction!
hash: String, hmac: String): ItemPaidAction! upsertPoll(id: ID, sub: String, title: String!, text: String, options: [String!]!, boost: Int, forward: [ItemForwardInput], pollExpiresAt: Date): ItemPaidAction!
upsertBounty(
id: ID, sub: String, title: String!, text: String, bounty: Int, boost: Int, forward: [ItemForwardInput],
hash: String, hmac: String): ItemPaidAction!
upsertJob(
id: ID, sub: String!, title: String!, company: String!, location: String, remote: Boolean,
text: String!, url: String!, boost: Int, status: String, logo: Int): ItemPaidAction!
upsertPoll(
id: ID, sub: String, title: String!, text: String, options: [String!]!, boost: Int, forward: [ItemForwardInput], pollExpiresAt: Date,
hash: String, hmac: String): ItemPaidAction!
updateNoteId(id: ID!, noteId: String!): Item! updateNoteId(id: ID!, noteId: String!): Item!
upsertComment(id: ID, text: String!, parentId: ID, boost: Int, hash: String, hmac: String): ItemPaidAction! upsertComment(id:ID, text: String!, parentId: ID): ItemPaidAction!
act(id: ID!, sats: Int, act: String, hasSendWallet: Boolean): ItemActPaidAction! act(id: ID!, sats: Int, act: String, idempotent: Boolean): ItemActPaidAction!
pollVote(id: ID!): PollVotePaidAction! pollVote(id: ID!): PollVotePaidAction!
toggleOutlaw(id: ID!): Item! toggleOutlaw(id: ID!): Item!
} }
@ -87,7 +70,6 @@ export default gql`
cursor: String cursor: String
items: [Item!]! items: [Item!]!
pins: [Item!] pins: [Item!]
ad: Item
} }
type Comments { type Comments {
@ -107,7 +89,6 @@ export default gql`
id: ID! id: ID!
createdAt: Date! createdAt: Date!
updatedAt: Date! updatedAt: Date!
invoicePaidAt: Date
deletedAt: Date deletedAt: Date
deleteScheduledAt: Date deleteScheduledAt: Date
reminderScheduledAt: Date reminderScheduledAt: Date
@ -128,13 +109,10 @@ export default gql`
bountyPaidTo: [Int] bountyPaidTo: [Int]
noteId: String noteId: String
sats: Int! sats: Int!
credits: Int!
commentSats: Int! commentSats: Int!
commentCredits: Int!
lastCommentAt: Date lastCommentAt: Date
upvotes: Int! upvotes: Int!
meSats: Int! meSats: Int!
meCredits: Int!
meDontLikeSats: Int! meDontLikeSats: Int!
meBookmark: Boolean! meBookmark: Boolean!
meSubscription: Boolean! meSubscription: Boolean!
@ -145,11 +123,11 @@ export default gql`
bio: Boolean! bio: Boolean!
paidImgLink: Boolean paidImgLink: Boolean
ncomments: Int! ncomments: Int!
nDirectComments: Int! comments(sort: String): [Item!]!
comments(sort: String, cursor: String): Comments!
path: String path: String
position: Int position: Int
prior: Int prior: Int
maxBid: Int
isJob: Boolean! isJob: Boolean!
pollCost: Int pollCost: Int
poll: Poll poll: Poll
@ -159,7 +137,7 @@ export default gql`
remote: Boolean remote: Boolean
sub: Sub sub: Sub
subName: String subName: String
status: String! status: String
uploadId: Int uploadId: Int
otsHash: String otsHash: String
parentOtsHash: String parentOtsHash: String
@ -168,7 +146,6 @@ export default gql`
rel: String rel: String
apiKey: Boolean apiKey: Boolean
invoice: Invoice invoice: Invoice
cost: Int!
} }
input ItemForwardInput { input ItemForwardInput {

@ -79,7 +79,6 @@ export default gql`
id: ID! id: ID!
sortTime: Date! sortTime: Date!
days: Int days: Int
type: String!
} }
type Earn { type Earn {
@ -124,12 +123,9 @@ export default gql`
withdrawl: Withdrawl! withdrawl: Withdrawl!
} }
union ReferralSource = Item | Sub | User
type Referral { type Referral {
id: ID! id: ID!
sortTime: Date! sortTime: Date!
source: ReferralSource
} }
type SubStatus { type SubStatus {

@ -7,13 +7,11 @@ extend type Query {
} }
extend type Mutation { extend type Mutation {
retryPaidAction(invoiceId: Int!, newAttempt: Boolean): PaidAction! retryPaidAction(invoiceId: Int!): PaidAction!
} }
enum PaymentMethod { enum PaymentMethod {
REWARD_SATS
FEE_CREDIT FEE_CREDIT
ZERO_COST
OPTIMISTIC OPTIMISTIC
PESSIMISTIC PESSIMISTIC
} }
@ -53,9 +51,4 @@ type DonatePaidAction implements PaidAction {
paymentMethod: PaymentMethod! paymentMethod: PaymentMethod!
} }
type BuyCreditsPaidAction implements PaidAction {
result: BuyCreditsResult
invoice: Invoice
paymentMethod: PaymentMethod!
}
` `

@ -19,7 +19,6 @@ export default gql`
time: Date! time: Date!
sources: [NameValue!]! sources: [NameValue!]!
leaderboard: UsersNullable leaderboard: UsersNullable
ad: Item
} }
type Reward { type Reward {

@ -16,8 +16,7 @@ export default gql`
extend type Mutation { extend type Mutation {
upsertSub(oldName: String, name: String!, desc: String, baseCost: Int!, upsertSub(oldName: String, name: String!, desc: String, baseCost: Int!,
replyCost: Int!, postTypes: [String!]!, allowFreebies: Boolean!,
postTypes: [String!]!,
billingType: String!, billingAutoRenew: Boolean!, billingType: String!, billingAutoRenew: Boolean!,
moderated: Boolean!, nsfw: Boolean!): SubPaidAction! moderated: Boolean!, nsfw: Boolean!): SubPaidAction!
paySub(name: String!): SubPaidAction! paySub(name: String!): SubPaidAction!
@ -25,13 +24,13 @@ export default gql`
toggleSubSubscription(name: String!): Boolean! toggleSubSubscription(name: String!): Boolean!
transferTerritory(subName: String!, userName: String!): Sub transferTerritory(subName: String!, userName: String!): Sub
unarchiveTerritory(name: String!, desc: String, baseCost: Int!, unarchiveTerritory(name: String!, desc: String, baseCost: Int!,
replyCost: Int!, postTypes: [String!]!, postTypes: [String!]!, allowFreebies: Boolean!,
billingType: String!, billingAutoRenew: Boolean!, billingType: String!, billingAutoRenew: Boolean!,
moderated: Boolean!, nsfw: Boolean!): SubPaidAction! moderated: Boolean!, nsfw: Boolean!): SubPaidAction!
} }
type Sub { type Sub {
name: String! name: ID!
createdAt: Date! createdAt: Date!
userId: Int! userId: Int!
user: User! user: User!
@ -46,7 +45,6 @@ export default gql`
billedLastAt: Date! billedLastAt: Date!
billPaidUntil: Date billPaidUntil: Date
baseCost: Int! baseCost: Int!
replyCost: Int!
status: String! status: String!
moderated: Boolean! moderated: Boolean!
moderatedCount: Int! moderatedCount: Int!

@ -1,26 +1,12 @@
import { gql } from 'graphql-tag' import { gql } from 'graphql-tag'
export default gql` export default gql`
type UploadFees { extend type Mutation {
totalFees: Int! getSignedPOST(type: String!, size: Int!, width: Int!, height: Int!, avatar: Boolean): SignedPost!
totalFeesMsats: Int!
uploadFees: Int!
uploadFeesMsats: Int!
nUnpaid: Int!
bytesUnpaid: Int!
bytes24h: Int!
} }
type SignedPost { type SignedPost {
url: String! url: String!
fields: JSONObject! fields: JSONObject!
} }
extend type Query {
uploadFees(s3Keys: [Int]!): UploadFees!
}
extend type Mutation {
getSignedPOST(type: String!, size: Int!, width: Int!, height: Int!, avatar: Boolean): SignedPost!
}
` `

@ -4,7 +4,7 @@ export default gql`
extend type Query { extend type Query {
me: User me: User
settings: User settings: User
user(id: ID, name: String): User user(name: String!): User
users: [User!] users: [User!]
nameAvailable(name: String!): Boolean! nameAvailable(name: String!): Boolean!
topUsers(cursor: String, when: String, from: String, to: String, by: String, limit: Limit): UsersNullable! topUsers(cursor: String, when: String, from: String, to: String, by: String, limit: Limit): UsersNullable!
@ -33,7 +33,7 @@ export default gql`
setName(name: String!): String setName(name: String!): String
setSettings(settings: SettingsInput!): User setSettings(settings: SettingsInput!): User
setPhoto(photoId: ID!): Int! setPhoto(photoId: ID!): Int!
upsertBio(text: String!): ItemPaidAction! upsertBio(bio: String!): User!
setWalkthrough(tipPopover: Boolean, upvotePopover: Boolean): Boolean setWalkthrough(tipPopover: Boolean, upvotePopover: Boolean): Boolean
unlinkAuth(authType: String!): AuthMethods! unlinkAuth(authType: String!): AuthMethods!
linkUnverifiedEmail(email: String!): Boolean linkUnverifiedEmail(email: String!): Boolean
@ -43,13 +43,12 @@ export default gql`
toggleMute(id: ID): User toggleMute(id: ID): User
generateApiKey(id: ID!): String generateApiKey(id: ID!): String
deleteApiKey(id: ID!): User deleteApiKey(id: ID!): User
disableFreebies: Boolean
} }
type User { type User {
id: ID! id: ID!
createdAt: Date! createdAt: Date!
name: String! name: String
nitems(when: String, from: String, to: String): Int! nitems(when: String, from: String, to: String): Int!
nposts(when: String, from: String, to: String): Int! nposts(when: String, from: String, to: String): Int!
nterritories(when: String, from: String, to: String): Int! nterritories(when: String, from: String, to: String): Int!
@ -59,11 +58,6 @@ export default gql`
photoId: Int photoId: Int
since: Int since: Int
"""
this is only returned when we sort stackers by value
"""
proportion: Float
optional: UserOptional! optional: UserOptional!
privates: UserPrivates privates: UserPrivates
@ -77,8 +71,7 @@ export default gql`
diagnostics: Boolean! diagnostics: Boolean!
noReferralLinks: Boolean! noReferralLinks: Boolean!
fiatCurrency: String! fiatCurrency: String!
satsFilter: Int! greeterMode: Boolean!
disableFreebies: Boolean
hideBookmarks: Boolean! hideBookmarks: Boolean!
hideCowboyHat: Boolean! hideCowboyHat: Boolean!
hideGithub: Boolean! hideGithub: Boolean!
@ -89,7 +82,6 @@ export default gql`
hideIsContributor: Boolean! hideIsContributor: Boolean!
hideWalletBalance: Boolean! hideWalletBalance: Boolean!
imgproxyOnly: Boolean! imgproxyOnly: Boolean!
showImagesAndVideos: Boolean!
nostrCrossposting: Boolean! nostrCrossposting: Boolean!
nostrPubkey: String nostrPubkey: String
nostrRelays: [String!] nostrRelays: [String!]
@ -106,16 +98,10 @@ export default gql`
noteItemMentions: Boolean! noteItemMentions: Boolean!
nsfwMode: Boolean! nsfwMode: Boolean!
tipDefault: Int! tipDefault: Int!
tipRandomMin: Int
tipRandomMax: Int
turboTipping: Boolean! turboTipping: Boolean!
zapUndos: Int zapUndos: Int
wildWestMode: Boolean! wildWestMode: Boolean!
withdrawMaxFeeDefault: Int! withdrawMaxFeeDefault: Int!
proxyReceive: Boolean
directReceive: Boolean
receiveCreditsBelowSats: Int!
sendCreditsBelowSats: Int!
} }
type AuthMethods { type AuthMethods {
@ -132,7 +118,6 @@ export default gql`
extremely sensitive extremely sensitive
""" """
sats: Int! sats: Int!
credits: Int!
authMethods: AuthMethods! authMethods: AuthMethods!
lnAddr: String lnAddr: String
@ -153,8 +138,6 @@ export default gql`
diagnostics: Boolean! diagnostics: Boolean!
noReferralLinks: Boolean! noReferralLinks: Boolean!
fiatCurrency: String! fiatCurrency: String!
satsFilter: Int!
disableFreebies: Boolean
greeterMode: Boolean! greeterMode: Boolean!
hideBookmarks: Boolean! hideBookmarks: Boolean!
hideCowboyHat: Boolean! hideCowboyHat: Boolean!
@ -166,7 +149,6 @@ export default gql`
hideIsContributor: Boolean! hideIsContributor: Boolean!
hideWalletBalance: Boolean! hideWalletBalance: Boolean!
imgproxyOnly: Boolean! imgproxyOnly: Boolean!
showImagesAndVideos: Boolean!
nostrCrossposting: Boolean! nostrCrossposting: Boolean!
nostrPubkey: String nostrPubkey: String
nostrRelays: [String!] nostrRelays: [String!]
@ -183,22 +165,12 @@ export default gql`
noteItemMentions: Boolean! noteItemMentions: Boolean!
nsfwMode: Boolean! nsfwMode: Boolean!
tipDefault: Int! tipDefault: Int!
tipRandom: Boolean!
tipRandomMin: Int
tipRandomMax: Int
turboTipping: Boolean! turboTipping: Boolean!
zapUndos: Int zapUndos: Int
wildWestMode: Boolean! wildWestMode: Boolean!
withdrawMaxFeeDefault: Int! withdrawMaxFeeDefault: Int!
autoWithdrawThreshold: Int autoWithdrawThreshold: Int
autoWithdrawMaxFeePercent: Float autoWithdrawMaxFeePercent: Float
autoWithdrawMaxFeeTotal: Int
vaultKeyHash: String
walletsUpdatedAt: Date
proxyReceive: Boolean
directReceive: Boolean
receiveCreditsBelowSats: Int!
sendCreditsBelowSats: Int!
} }
type UserOptional { type UserOptional {
@ -209,15 +181,13 @@ export default gql`
spent(when: String, from: String, to: String): Int spent(when: String, from: String, to: String): Int
referrals(when: String, from: String, to: String): Int referrals(when: String, from: String, to: String): Int
streak: Int streak: Int
gunStreak: Int
horseStreak: Int
maxStreak: Int maxStreak: Int
isContributor: Boolean isContributor: Boolean
githubId: String githubId: String
twitterId: String twitterId: String
nostrAuthPubkey: String nostrAuthPubkey: String
} }
type NameValue { type NameValue {
name: String! name: String!
value: Float! value: Float!


@ -1,29 +0,0 @@
import { gql } from 'graphql-tag'
export default gql`
type VaultEntry {
id: ID!
key: String!
iv: String!
value: String!
createdAt: Date!
updatedAt: Date!
}
input VaultEntryInput {
key: String!
iv: String!
value: String!
walletId: ID
}
extend type Query {
getVaultEntry(key: String!): VaultEntry
getVaultEntries(keysFilter: [String!]): [VaultEntry!]!
}
extend type Mutation {
clearVault: Boolean
updateVaultKey(entries: [VaultEntryInput!]!, hash: String!): Boolean
}
`
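The vault schema removed here (left column) could be consumed from the client roughly as follows; this is a sketch assuming standard Apollo usage, not code from the repository.

import { gql } from '@apollo/client'

// query and mutation names come from the schema above
const GET_VAULT_ENTRIES = gql`
  query getVaultEntries($keysFilter: [String!]) {
    getVaultEntries(keysFilter: $keysFilter) {
      id
      key
      iv
      value
      createdAt
      updatedAt
    }
  }`

const UPDATE_VAULT_KEY = gql`
  mutation updateVaultKey($entries: [VaultEntryInput!]!, $hash: String!) {
    updateVaultKey(entries: $entries, hash: $hash)
  }`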


@ -1,125 +1,95 @@
import { gql } from 'graphql-tag' import { gql } from 'graphql-tag'
import { fieldToGqlArg, fieldToGqlArgOptional, generateResolverName, generateTypeDefName } from '@/wallets/graphql' import { generateResolverName } from '@/lib/wallet'
import { isServerField } from '@/wallets/common'
import walletDefs from '@/wallets/server' import walletDefs from 'wallets/server'
Left column:

function injectTypeDefs (typeDefs) {
  const injected = [rawTypeDefs(), mutationTypeDefs()]
  return `${typeDefs}\n\n${injected.join('\n\n')}\n`
}

function mutationTypeDefs () {
  console.group('injected GraphQL mutations:')
  const typeDefs = walletDefs.map((w) => {
    let args = 'id: ID, '
    const serverFields = w.fields
      .filter(isServerField)
      .map(fieldToGqlArgOptional)
    if (serverFields.length > 0) args += serverFields.join(', ') + ','
    args += 'enabled: Boolean, priority: Int, vaultEntries: [VaultEntryInput!], settings: AutowithdrawSettings, validateLightning: Boolean'
    const resolverName = generateResolverName(w.walletField)
    const typeDef = `${resolverName}(${args}): Wallet`
    console.log(typeDef)
    return typeDef
  })
  console.groupEnd()
  return `extend type Mutation {\n${typeDefs.join('\n')}\n}`
}

function rawTypeDefs () {
  console.group('injected GraphQL type defs:')
  const typeDefs = walletDefs.map((w) => {
    let args = w.fields
      .filter(isServerField)
      .map(fieldToGqlArg)
      .map(s => ' ' + s)
      .join('\n')
    if (!args) {
      // add a placeholder arg so the type is not empty
      args = ' _empty: Boolean'
    }
    const typeDefName = generateTypeDefName(w.walletType)
    const typeDef = `type ${typeDefName} {\n${args}\n}`
    console.log(typeDef)
    return typeDef
  })
  let union = 'union WalletDetails = '
  union += walletDefs.map((w) => {
    const typeDefName = generateTypeDefName(w.walletType)
    return typeDefName
  }).join(' | ')
  console.log(union)
  console.groupEnd()
  return typeDefs.join('\n\n') + union
}

Right column:

function injectTypeDefs (typeDefs) {
  console.group('injected GraphQL type defs:')
  const injected = walletDefs.map(
    (w) => {
      let args = 'id: ID, '
      args += w.fields.map(f => {
        let arg = `${f.name}: String`
        if (!f.optional) {
          arg += '!'
        }
        return arg
      }).join(', ')
      args += ', settings: AutowithdrawSettings!'
      const resolverName = generateResolverName(w.walletField)
      const typeDef = `${resolverName}(${args}): Boolean`
      console.log(typeDef)
      return typeDef
    })
  console.groupEnd()
  return `${typeDefs}\n\nextend type Mutation {\n${injected.join('\n')}\n}`
}
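To make the injection concrete, here is a hedged sketch of what one generated mutation def could look like; the resolver-name convention is an assumption, since generateResolverName's body is not shown in this diff.

// a minimal sketch, assuming generateResolverName simply prefixes the wallet
// field with "upsert"; the real implementation lives in '@/lib/wallet' (right)
// or '@/wallets/graphql' (left) and may differ
function generateResolverName (walletField) {
  return `upsert${walletField[0].toUpperCase()}${walletField.slice(1)}`
}

// for a hypothetical def { walletField: 'walletLNAddr', fields: [{ name: 'address' }] }
// the right-column injectTypeDefs above would then log something like:
//   upsertWalletLNAddr(id: ID, address: String!, settings: AutowithdrawSettings!): Boolean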
const typeDefs = ` const typeDefs = `
extend type Query { extend type Query {
invoice(id: ID!): Invoice! invoice(id: ID!): Invoice!
withdrawl(id: ID!): Withdrawl! withdrawl(id: ID!): Withdrawl!
direct(id: ID!): Direct!
numBolt11s: Int! numBolt11s: Int!
connectAddress: String! connectAddress: String!
walletHistory(cursor: String, inc: String): History walletHistory(cursor: String, inc: String): History
wallets(includeReceivers: Boolean, includeSenders: Boolean, onlyEnabled: Boolean, prioritySort: String): [Wallet!]! wallets: [Wallet!]!
wallet(id: ID!): Wallet wallet(id: ID!): Wallet
walletByType(type: String!): Wallet walletByType(type: String!): Wallet
walletLogs(type: String, from: String, to: String, cursor: String): WalletLog! walletLogs: [WalletLog]!
failedInvoices: [Invoice!]!
} }
extend type Mutation { extend type Mutation {
createInvoice(amount: Int!): InvoiceOrDirect! createInvoice(amount: Int!, expireSecs: Int, hodlInvoice: Boolean): Invoice!
createWithdrawl(invoice: String!, maxFee: Int!): Withdrawl! createWithdrawl(invoice: String!, maxFee: Int!): Withdrawl!
sendToLnAddr(addr: String!, amount: Int!, maxFee: Int!, comment: String, identifier: Boolean, name: String, email: String): Withdrawl! sendToLnAddr(addr: String!, amount: Int!, maxFee: Int!, comment: String, identifier: Boolean, name: String, email: String): Withdrawl!
cancelInvoice(hash: String!, hmac: String, userCancel: Boolean): Invoice! cancelInvoice(hash: String!, hmac: String!): Invoice!
dropBolt11(hash: String!): Boolean dropBolt11(id: ID): Withdrawl
removeWallet(id: ID!): Boolean removeWallet(id: ID!): Boolean
deleteWalletLogs(wallet: String): Boolean deleteWalletLogs(wallet: String): Boolean
setWalletPriority(id: ID!, priority: Int!): Boolean
buyCredits(credits: Int!): BuyCreditsPaidAction!
}
type BuyCreditsResult {
credits: Int!
}
interface InvoiceOrDirect {
id: ID!
} }
type Wallet { type Wallet {
id: ID! id: ID!
createdAt: Date! createdAt: Date!
updatedAt: Date!
type: String! type: String!
enabled: Boolean! enabled: Boolean!
priority: Int! priority: Int!
wallet: WalletDetails! wallet: WalletDetails!
vaultEntries: [VaultEntry!]!
} }
type WalletLNAddr {
address: String!
}
type WalletLND {
socket: String!
macaroon: String!
cert: String
}
type WalletCLN {
socket: String!
rune: String!
cert: String
}
union WalletDetails = WalletLNAddr | WalletLND | WalletCLN
input AutowithdrawSettings { input AutowithdrawSettings {
autoWithdrawThreshold: Int! autoWithdrawThreshold: Int!
autoWithdrawMaxFeePercent: Float! autoWithdrawMaxFeePercent: Float!
autoWithdrawMaxFeeTotal: Int! priority: Int
enabled: Boolean
} }
type Invoice implements InvoiceOrDirect { type Invoice {
id: ID! id: ID!
createdAt: Date! createdAt: Date!
hash: String! hash: String!
bolt11: String! bolt11: String!
expiresAt: Date! expiresAt: Date!
cancelled: Boolean! cancelled: Boolean!
cancelledAt: Date
confirmedAt: Date confirmedAt: Date
satsReceived: Int satsReceived: Int
satsRequested: Int! satsRequested: Int!
@ -132,11 +102,8 @@ const typeDefs = `
actionState: String actionState: String
actionType: String actionType: String
actionError: String actionError: String
invoiceForward: Boolean
item: Item item: Item
itemAct: ItemAct itemAct: ItemAct
forwardedSats: Int
forwardStatus: String
} }
type Withdrawl { type Withdrawl {
@ -151,19 +118,6 @@ const typeDefs = `
status: String status: String
autoWithdraw: Boolean! autoWithdraw: Boolean!
preimage: String preimage: String
forwardedActionType: String
}
type Direct implements InvoiceOrDirect {
id: ID!
createdAt: Date!
bolt11: String
hash: String
sats: Int
preimage: String
nostr: JSONObject
comment: String
lud18Data: JSONObject
} }
type Fact { type Fact {
@ -187,17 +141,11 @@ const typeDefs = `
} }
type WalletLog { type WalletLog {
entries: [WalletLogEntry!]!
cursor: String
}
type WalletLogEntry {
id: ID! id: ID!
createdAt: Date! createdAt: Date!
wallet: ID! wallet: ID!
level: String! level: String!
message: String! message: String!
context: JSONObject
} }
` `
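As a usage sketch against the wallet schema above (field selection is illustrative; it targets the right-column shapes, since the left column adds filter arguments to wallets and returns the InvoiceOrDirect interface from createInvoice):

import { gql } from '@apollo/client'

const WALLETS = gql`
  query Wallets {
    wallets {
      id
      type
      enabled
      priority
    }
  }`

const CREATE_INVOICE = gql`
  mutation createInvoice($amount: Int!) {
    createInvoice(amount: $amount) {
      id
      hash
      bolt11
      expiresAt
    }
  }`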


@ -18,7 +18,7 @@ felipebueno,pr,#948,,,,,,100k,felipe@stacker.news,2024-03-26
benalleng,pr,#972,#923,good-first-issue,,,,20k,BenAllenG@stacker.news,2024-03-26 benalleng,pr,#972,#923,good-first-issue,,,,20k,BenAllenG@stacker.news,2024-03-26
SatsAllDay,issue,#972,#923,good-first-issue,,,,2k,weareallsatoshi@getalby.com,2024-03-26 SatsAllDay,issue,#972,#923,good-first-issue,,,,2k,weareallsatoshi@getalby.com,2024-03-26
felipebueno,pr,#974,#884,good-first-issue,,,,20k,felipe@stacker.news,2024-03-26 felipebueno,pr,#974,#884,good-first-issue,,,,20k,felipe@stacker.news,2024-03-26
h0dlr,issue,#974,#884,good-first-issue,,,,2k,HODLR@stacker.news,2024-04-04 h0dlr,issue,#974,#884,good-first-issue,,,,2k,0xe14b9b5981c729a3@ln.tips,2024-04-04
benalleng,pr,#975,,,,,,20k,BenAllenG@stacker.news,2024-03-26 benalleng,pr,#975,,,,,,20k,BenAllenG@stacker.news,2024-03-26
SatsAllDay,security,#980,GHSA-qg4g-m4xq-695p,,,,,100k,weareallsatoshi@getalby.com,2024-03-28 SatsAllDay,security,#980,GHSA-qg4g-m4xq-695p,,,,,100k,weareallsatoshi@getalby.com,2024-03-28
SatsAllDay,code review,#980,GHSA-qg4g-m4xq-695p,medium,,,,25k,weareallsatoshi@getalby.com,2024-03-28 SatsAllDay,code review,#980,GHSA-qg4g-m4xq-695p,medium,,,,25k,weareallsatoshi@getalby.com,2024-03-28
@ -115,93 +115,3 @@ cointastical,issue,#1223,#107,medium,,2,,20k,cointastical@stacker.news,2024-06-2
kravhen,pr,#1215,#253,medium,,2,upgraded to medium,200k,nichro@getalby.com,2024-06-28 kravhen,pr,#1215,#253,medium,,2,upgraded to medium,200k,nichro@getalby.com,2024-06-28
dillon-co,pr,#1140,#633,hard,,,requested advance,500k,bolt11,2024-07-02 dillon-co,pr,#1140,#633,hard,,,requested advance,500k,bolt11,2024-07-02
takitakitanana,issue,,#1257,good-first-issue,,,,2k,takitakitanana@stacker.news,2024-07-11 takitakitanana,issue,,#1257,good-first-issue,,,,2k,takitakitanana@stacker.news,2024-07-11
SatsAllDay,pr,#1263,#1112,medium,,,1,225k,weareallsatoshi@getalby.com,2024-07-31
OneOneSeven117,issue,#1272,#1268,easy,,,,10k,OneOneSeven@stacker.news,2024-07-31
aniskhalfallah,pr,#1264,#1226,good-first-issue,,,,20k,aniskhalfallah@stacker.news,2024-07-31
Gudnessuche,issue,#1264,#1226,good-first-issue,,,,2k,everythingsatoshi@getalby.com,2024-08-10
aniskhalfallah,pr,#1289,,easy,,,,100k,aniskhalfallah@blink.sv,2024-08-12
riccardobl,pr,#1293,#1142,medium,high,,,500k,rblb@getalby.com,2024-08-18
tsmith123,pr,#1306,#832,medium,,,,250k,stickymarch60@walletofsatoshi.com,2024-08-20
riccardobl,pr,#1311,#864,medium,high,,pending unrelated refactor,500k,rblb@getalby.com,2024-08-27
brugeman,issue,#1311,#864,medium,high,,,50k,brugeman@stacker.news,2024-08-27
riccardobl,pr,#1342,#1141,hard,high,,pending unrelated rearchitecture,1m,rblb@getalby.com,2024-09-09
SatsAllDay,issue,#1368,#1331,medium,,,,25k,weareallsatoshi@getalby.com,2024-09-16
benalleng,helpfulness,#1368,#1170,medium,,,did a lot of it in #1175,25k,BenAllenG@stacker.news,2024-09-16
humble-GOAT,issue,#1412,#1407,good-first-issue,,,,2k,humble_GOAT@stacker.news,2024-09-18
felipebueno,issue,#1425,#986,medium,,,,25k,felipebueno@getalby.com,2024-09-26
riccardobl,pr,#1373,#1304,hard,high,,,2m,bolt11,2024-10-01
tsmith123,pr,#1428,#1397,easy,,1,superceded,90k,stickymarch60@walletofsatoshi.com,2024-10-02
toyota-corolla0,pr,#1449,,good-first-issue,,,,20k,toyota_corolla0@stacker.news,2024-10-02
toyota-corolla0,pr,#1455,#1437,good-first-issue,,,,20k,toyota_corolla0@stacker.news,2024-10-02
SouthKoreaLN,issue,#1436,,easy,,,,10k,south_korea_ln@stacker.news,2024-10-02
TonyGiorgio,issue,#1462,,easy,urgent,,,30k,TonyGiorgio@stacker.news,2024-10-07
hkarani,issue,#1369,#1458,good-first-issue,,,,2k,asterisk32@stacker.news,2024-10-21
toyota-corolla0,pr,#1369,#1458,good-first-issue,,,,20k,toyota_corolla0@stacker.news,2024-10-20
Soxasora,pr,#1593,#1569,good-first-issue,,,,20k,soxasora@blink.sv,2024-11-19
Soxasora,pr,#1599,#1258,medium,,,,250k,soxasora@blink.sv,2024-11-19
aegroto,pr,#1585,#1522,easy,high,,1,180k,aegroto@blink.sv,2024-11-19
sig47,issue,#1585,#1522,easy,high,,1,18k,siggy47@stacker.news,2024-11-19
aegroto,pr,#1583,#1572,easy,,,2,80k,aegroto@blink.sv,2024-11-19
Soxasora,pr,#1617,#1616,easy,,,,100k,soxasora@blink.sv,2024-11-20
Soxasora,issue,#1617,#1616,easy,,,,10k,soxasora@blink.sv,2024-11-20
AndreaDiazCorreia,helpfulness,#1605,#1566,good-first-issue,,,tried in pr,2k,andrea@lawallet.ar,2024-11-20
Soxasora,pr,#1653,,medium,,,determined unecessary,250k,soxasora@blink.sv,2024-12-07
Soxasora,pr,#1659,#1657,easy,,,,100k,soxasora@blink.sv,2024-12-07
sig47,issue,#1659,#1657,easy,,,,10k,siggy47@stacker.news,2024-12-07
Gudnessuche,issue,#1662,#1661,good-first-issue,,,,2k,everythingsatoshi@getalby.com,2024-12-07
aegroto,pr,#1589,#1586,easy,,,,100k,aegroto@blink.sv,2024-12-07
aegroto,issue,#1589,#1586,easy,,,,10k,aegroto@blink.sv,2024-12-07
aegroto,pr,#1619,#914,easy,,,,100k,aegroto@blink.sv,2024-12-07
felipebueno,pr,#1620,,medium,,,1,225k,felipebueno@getalby.com,2024-12-09
Soxasora,pr,#1647,#1645,easy,,,,100k,soxasora@blink.sv,2024-12-07
Soxasora,pr,#1667,#1568,easy,,,,100k,soxasora@blink.sv,2024-12-07
aegroto,pr,#1633,#1471,easy,,,1,90k,aegroto@blink.sv,2024-12-07
Darth-Coin,issue,#1649,#1421,medium,,,,25k,darthcoin@stacker.news,2024-12-07
Soxasora,pr,#1685,,medium,,,,250k,soxasora@blink.sv,2024-12-07
aegroto,pr,#1606,#1242,medium,,,,250k,aegroto@blink.sv,2024-12-07
sfr0xyz,issue,#1696,#1196,good-first-issue,,,,2k,sefiro@getalby.com,2024-12-10
Soxasora,pr,#1794,#756,hard,urgent,,includes #411,3m,bolt11,2025-01-09
Soxasora,pr,#1786,#363,easy,,,,100k,bolt11,2025-01-09
Soxasora,pr,#1768,#1186,medium-hard,,,,500k,bolt11,2025-01-09
Soxasora,pr,#1750,#1035,medium,,,,250k,bolt11,2025-01-09
SatsAllDay,issue,#1794,#411,hard,high,,,200k,weareallsatoshi@getalby.com,2025-01-20
felipebueno,issue,#1786,#363,easy,,,,10k,felipebueno@blink.sv,2025-01-27
cyphercosmo,pr,#1745,#1648,good-first-issue,,,2,16k,cyphercosmo@getalby.com,2025-01-27
Radentor,issue,#1768,#1186,medium-hard,,,,50k,revisedbird84@walletofsatoshi.com,2025-01-27
Soxasora,pr,#1841,#1692,good-first-issue,,,,20k,soxasora@blink.sv,2025-01-27
Soxasora,pr,#1839,#1790,easy,,,1,90k,soxasora@blink.sv,2025-01-27
Soxasora,pr,#1820,#1819,easy,,,1,90k,soxasora@blink.sv,2025-01-27
SatsAllDay,issue,#1820,#1819,easy,,,1,9k,weareallsatoshi@getalby.com,2025-01-27
Soxasora,pr,#1814,#1736,easy,,,,100k,soxasora@blink.sv,2025-01-27
jason-me,pr,#1857,,easy,,,,100k,rrbtc@vlt.ge,2025-02-08
ed-kung,pr,#1901,#323,good-first-issue,,,,20k,simplestacker@getalby.com,2025-02-14
Scroogey-SN,pr,#1911,#1905,good-first-issue,,,1,18k,Scroogey@coinos.io,2025-03-10
Scroogey-SN,pr,#1928,#1924,good-first-issue,,,,20k,Scroogey@coinos.io,2025-03-10
dtonon,issue,#1928,#1924,good-first-issue,,,,2k,???,???
ed-kung,pr,#1926,#1914,medium-hard,,,,500k,simplestacker@getalby.com,2025-03-10
ed-kung,issue,#1926,#1914,medium-hard,,,,50k,simplestacker@getalby.com,2025-03-10
ed-kung,pr,#1926,#1927,easy,,,,100k,simplestacker@getalby.com,2025-03-10
ed-kung,issue,#1926,#1927,easy,,,,10k,simplestacker@getalby.com,2025-03-10
ed-kung,issue,#1913,#1890,good-first-issue,,,,2k,simplestacker@getalby.com,2025-03-10
Scroogey-SN,pr,#1930,#1167,good-first-issue,,,,20k,Scroogey@coinos.io,2025-03-10
itsrealfake,issue,#1930,#1167,good-first-issue,,,,2k,smallimagination100035@getalby.com,???
Scroogey-SN,pr,#1948,#1849,medium,urgent,,,750k,Scroogey@coinos.io,2025-03-10
felipebueno,issue,#1947,#1945,good-first-issue,,,,2k,felipebueno@blink.sv,2025-03-10
ed-kung,pr,#1952,#1951,easy,,,,100k,simplestacker@getalby.com,2025-03-10
ed-kung,issue,#1952,#1951,easy,,,,10k,simplestacker@getalby.com,2025-03-10
Scroogey-SN,pr,#1973,#1959,good-first-issue,,,,20k,Scroogey@coinos.io,???
benthecarman,issue,#1953,#1950,good-first-issue,,,,2k,???,???
ed-kung,pr,#2012,#2004,easy,,,,100k,simplestacker@getalby.com,???
ed-kung,issue,#2012,#2004,easy,,,,10k,simplestacker@getalby.com,???
ed-kung,pr,#1993,#1982,good-first-issue,,,,20k,simplestacker@getalby.com,???
rideandslide,issue,#1993,#1982,good-first-issue,,,,2k,???,???
ed-kung,pr,#1972,#1254,good-first-issue,,,,20k,simplestacker@getalby.com,???
SatsAllDay,issue,#1972,#1254,good-first-issue,,,,2k,weareallsatoshi@getalby.com,???
ed-kung,pr,#1962,#1343,good-first-issue,,,,20k,simplestacker@getalby.com,???
ed-kung,pr,#1962,#1217,good-first-issue,,,,20k,simplestacker@getalby.com,???
ed-kung,pr,#1962,#866,easy,,,,100k,simplestacker@getalby.com,???
felipebueno,issue,#1962,#866,easy,,,,10k,felipebueno@blink.sv,???
cointastical,issue,#1962,#1217,good-first-issue,,,,2k,cointastical@stacker.news,???
Scroogey-SN,pr,#1975,#1964,good-first-issue,,,,20k,Scroogey@coinos.io,???
rideandslide,issue,#1986,#1985,good-first-issue,,,,2k,???,???
kristapsk,issue,#1976,#841,good-first-issue,,,,2k,???,???


@ -11,7 +11,7 @@ RUN npm ci
COPY . . COPY . .
ADD https://deb.debian.org/debian/pool/main/f/fonts-noto-color-emoji/fonts-noto-color-emoji_0~20200916-1_all.deb fonts-noto-color-emoji.deb ADD http://ftp.de.debian.org/debian/pool/main/f/fonts-noto-color-emoji/fonts-noto-color-emoji_0~20200916-1_all.deb fonts-noto-color-emoji.deb
RUN dpkg -i fonts-noto-color-emoji.deb RUN dpkg -i fonts-noto-color-emoji.deb
CMD [ "node", "index.js" ] CMD [ "node", "index.js" ]
USER pptruser USER pptruser


@ -4,7 +4,6 @@ import { useAccordionButton } from 'react-bootstrap/AccordionButton'
import ArrowRight from '@/svgs/arrow-right-s-fill.svg' import ArrowRight from '@/svgs/arrow-right-s-fill.svg'
import ArrowDown from '@/svgs/arrow-down-s-fill.svg' import ArrowDown from '@/svgs/arrow-down-s-fill.svg'
import { useContext, useEffect, useState } from 'react' import { useContext, useEffect, useState } from 'react'
import classNames from 'classnames'
const KEY_ID = '0' const KEY_ID = '0'
@ -31,7 +30,7 @@ function ContextAwareToggle ({ children, headerColor = 'var(--theme-grey)', even
) )
} }
export default function AccordianItem ({ header, body, className, headerColor = 'var(--theme-grey)', show }) { export default function AccordianItem ({ header, body, headerColor = 'var(--theme-grey)', show }) {
const [activeKey, setActiveKey] = useState() const [activeKey, setActiveKey] = useState()
useEffect(() => { useEffect(() => {
@ -44,8 +43,8 @@ export default function AccordianItem ({ header, body, className, headerColor =
return ( return (
<Accordion defaultActiveKey={activeKey} activeKey={activeKey} onSelect={handleOnSelect}> <Accordion defaultActiveKey={activeKey} activeKey={activeKey} onSelect={handleOnSelect}>
<ContextAwareToggle show={show} eventKey={KEY_ID} headerColor={headerColor}><div style={{ color: headerColor }}>{header}</div></ContextAwareToggle> <ContextAwareToggle show={show} eventKey={KEY_ID}><div style={{ color: headerColor }}>{header}</div></ContextAwareToggle>
<Accordion.Collapse eventKey={KEY_ID} className={classNames('mt-2', className)}> <Accordion.Collapse eventKey={KEY_ID} className='mt-2'>
<div>{body}</div> <div>{body}</div>
</Accordion.Collapse> </Accordion.Collapse>
</Accordion> </Accordion>
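A usage sketch of the left-column API, where the new className prop is merged into the collapse body via classNames; the props shown here are illustrative, not taken from the repository.

<AccordianItem
  header='advanced options'
  headerColor='var(--bs-danger-text-emphasis)'
  className='my-3'
  show
  body={<div>collapsible content</div>}
/>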


@ -1,177 +0,0 @@
import { createContext, useCallback, useContext, useEffect, useMemo, useState } from 'react'
import { useRouter } from 'next/router'
import * as cookie from 'cookie'
import { useMe } from '@/components/me'
import { USER_ID, SSR } from '@/lib/constants'
import { USER } from '@/fragments/users'
import { useQuery } from '@apollo/client'
import { UserListRow } from '@/components/user-list'
import Link from 'next/link'
import AddIcon from '@/svgs/add-fill.svg'
import { MultiAuthErrorBanner } from '@/components/banners'
import { cookieOptions, MULTI_AUTH_ANON, MULTI_AUTH_LIST, MULTI_AUTH_POINTER } from '@/lib/auth'
const AccountContext = createContext()
const CHECK_ERRORS_INTERVAL_MS = 5_000
const b64Decode = str => Buffer.from(str, 'base64').toString('utf-8')
export const AccountProvider = ({ children }) => {
const [accounts, setAccounts] = useState([])
const [meAnon, setMeAnon] = useState(true)
const [errors, setErrors] = useState([])
const updateAccountsFromCookie = useCallback(() => {
const { [MULTI_AUTH_LIST]: listCookie } = cookie.parse(document.cookie)
const accounts = listCookie
? JSON.parse(b64Decode(listCookie))
: []
setAccounts(accounts)
}, [])
const nextAccount = useCallback(async () => {
const { status } = await fetch('/api/next-account', { credentials: 'include' })
// if status is 302, this means the server was able to switch us to the next available account
// and the current account was simply removed from the list of available accounts including the corresponding JWT.
const switchSuccess = status === 302
if (switchSuccess) updateAccountsFromCookie()
return switchSuccess
}, [updateAccountsFromCookie])
const checkErrors = useCallback(() => {
const {
[MULTI_AUTH_LIST]: listCookie,
[MULTI_AUTH_POINTER]: pointerCookie
} = cookie.parse(document.cookie)
const errors = []
if (!listCookie) errors.push(`${MULTI_AUTH_LIST} cookie not found`)
if (!pointerCookie) errors.push(`${MULTI_AUTH_POINTER} cookie not found`)
setErrors(errors)
}, [])
useEffect(() => {
if (SSR) return
updateAccountsFromCookie()
const { [MULTI_AUTH_POINTER]: pointerCookie } = cookie.parse(document.cookie)
setMeAnon(pointerCookie === 'anonymous')
const interval = setInterval(checkErrors, CHECK_ERRORS_INTERVAL_MS)
return () => clearInterval(interval)
}, [updateAccountsFromCookie, checkErrors])
const value = useMemo(
() => ({
accounts,
meAnon,
setMeAnon,
nextAccount,
multiAuthErrors: errors
}),
[accounts, meAnon, setMeAnon, nextAccount])
return <AccountContext.Provider value={value}>{children}</AccountContext.Provider>
}
export const useAccounts = () => useContext(AccountContext)
const AccountListRow = ({ account, ...props }) => {
const { meAnon, setMeAnon } = useAccounts()
const { me, refreshMe } = useMe()
const anonRow = account.id === USER_ID.anon
const selected = (meAnon && anonRow) || Number(me?.id) === Number(account.id)
const router = useRouter()
// fetch updated names and photo ids since they might have changed since we were issued the JWTs
const { data, error } = useQuery(USER,
{
variables: { id: account.id }
}
)
if (error) console.error(`query for user ${account.id} failed:`, error)
const name = data?.user?.name || account.name
const photoId = data?.user?.photoId || account.photoId
const onClick = async (e) => {
// prevent navigation
e.preventDefault()
// update pointer cookie
const options = cookieOptions({ httpOnly: false })
document.cookie = cookie.serialize(MULTI_AUTH_POINTER, anonRow ? MULTI_AUTH_ANON : account.id, options)
// update state
if (anonRow) {
// order is important to prevent flashes of no session
setMeAnon(true)
await refreshMe()
} else {
await refreshMe()
// order is important to prevent flashes of inconsistent data in switch account dialog
setMeAnon(account.id === USER_ID.anon)
}
// reload whatever page we're on to avoid any bugs due to missing authorization etc.
router.reload()
}
return (
<div className='d-flex flex-row'>
<UserListRow
user={{ ...account, photoId, name }}
className='d-flex align-items-center me-2'
{...props}
onNymClick={onClick}
selected={selected}
/>
</div>
)
}
export default function SwitchAccountList () {
const { accounts, multiAuthErrors } = useAccounts()
const router = useRouter()
const hasError = multiAuthErrors.length > 0
if (hasError) {
return (
<>
<div className='my-2'>
<div className='d-flex flex-column flex-wrap mt-2 mb-3'>
<MultiAuthErrorBanner errors={multiAuthErrors} />
</div>
</div>
</>
)
}
// can't show hat since the streak is not included in the JWT payload
return (
<>
<div className='my-2'>
<div className='d-flex flex-column flex-wrap mt-2 mb-3'>
<h4 className='text-muted'>Accounts</h4>
<AccountListRow account={{ id: USER_ID.anon, name: 'anon' }} showHat={false} />
{
accounts.map((account) => <AccountListRow key={account.id} account={account} showHat={false} />)
}
</div>
<Link
href={{
pathname: '/login',
query: { callbackUrl: window.location.origin + router.asPath, multiAuth: true }
}}
className='text-reset fw-bold'
>
<AddIcon height={20} width={20} /> another account
</Link>
</div>
</>
)
}
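The removed account module above is self-contained; wiring it up might look roughly like this sketch (the import path and where the provider is mounted are assumptions):

import { AccountProvider, useAccounts } from '@/components/account'

// mount once near the root so the hooks below have context:
//   <AccountProvider><App /></AccountProvider>

function NextAccountButton () {
  const { accounts, nextAccount } = useAccounts()
  return (
    <button onClick={async () => {
      const switched = await nextAccount()
      if (!switched) console.log('no other account available')
    }}
    >
      switch account ({accounts.length} on this device)
    </button>
  )
}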


@ -1,20 +1,16 @@
import { useState, useEffect, useMemo, useCallback } from 'react' import { useState, useEffect } from 'react'
import AccordianItem from './accordian-item' import AccordianItem from './accordian-item'
import { Input, InputUserSuggest, VariableInput, Checkbox } from './form' import { Input, InputUserSuggest, VariableInput, Checkbox } from './form'
import InputGroup from 'react-bootstrap/InputGroup' import InputGroup from 'react-bootstrap/InputGroup'
import { BOOST_MIN, BOOST_MULT, MAX_FORWARDS, SSR } from '@/lib/constants' import { BOOST_MIN, BOOST_MULT, MAX_FORWARDS } from '@/lib/constants'
import { DEFAULT_CROSSPOSTING_RELAYS } from '@/lib/nostr' import { DEFAULT_CROSSPOSTING_RELAYS } from '@/lib/nostr'
import Info from './info' import Info from './info'
import { abbrNum, numWithUnits } from '@/lib/format' import { numWithUnits } from '@/lib/format'
import styles from './adv-post-form.module.css' import styles from './adv-post-form.module.css'
import { useMe } from './me' import { useMe } from './me'
import { useFeeButton } from './fee-button' import { useFeeButton } from './fee-button'
import { useRouter } from 'next/router' import { useRouter } from 'next/router'
import { useFormikContext } from 'formik' import { useFormikContext } from 'formik'
import { gql, useQuery } from '@apollo/client'
import useDebounceCallback from './use-debounce-callback'
import { Button } from 'react-bootstrap'
import classNames from 'classnames'
const EMPTY_FORWARD = { nym: '', pct: '' } const EMPTY_FORWARD = { nym: '', pct: '' }
@ -30,153 +26,9 @@ const FormStatus = {
ERROR: 'error' ERROR: 'error'
} }
export function BoostHelp () { export default function AdvPostForm ({ children, item, storageKeyPrefix }) {
return ( const me = useMe()
<ol style={{ lineHeight: 1.25 }}> const { merge } = useFeeButton()
<li>Boost ranks items higher based on the amount</li>
<li>The highest boost in a territory over the last 30 days is pinned to the top of the territory</li>
<li>The highest boost across all territories over the last 30 days is pinned to the top of the homepage</li>
<li>The minimum boost is {numWithUnits(BOOST_MIN, { abbreviate: false })}</li>
<li>Each {numWithUnits(BOOST_MULT, { abbreviate: false })} of boost is equivalent to a zap-vote from a maximally trusted stacker (very rare)
<ul>
<li>e.g. {numWithUnits(BOOST_MULT * 5, { abbreviate: false })} is like five zap-votes from a maximally trusted stacker</li>
</ul>
</li>
<li>boost can take a few minutes to show higher ranking in feed</li>
<li>100% of boost goes to the territory founder and top stackers as rewards</li>
</ol>
)
}
export function BoostInput ({ onChange, ...props }) {
const feeButton = useFeeButton()
let merge
if (feeButton) {
({ merge } = feeButton)
}
return (
<Input
label={
<div className='d-flex align-items-center'>boost
<Info>
<BoostHelp />
</Info>
</div>
}
name='boost'
onChange={(_, e) => {
merge?.({
boost: {
term: `+ ${e.target.value}`,
label: 'boost',
op: '+',
modifier: cost => cost + Number(e.target.value)
}
})
onChange && onChange(_, e)
}}
hint={<span className='text-muted'>ranks posts higher temporarily based on the amount</span>}
append={<InputGroup.Text className='text-monospace'>sats</InputGroup.Text>}
{...props}
/>
)
}
const BoostMaxes = ({ subName, homeMax, subMax, boost, updateBoost }) => {
return (
<div className='d-flex flex-row mb-2'>
<Button
className={classNames(styles.boostMax, 'me-2', homeMax + BOOST_MULT <= (boost || 0) && 'invisible')}
size='sm'
onClick={() => updateBoost(homeMax + BOOST_MULT)}
>
{abbrNum(homeMax + BOOST_MULT)} <small>top of homepage</small>
</Button>
{subName &&
<Button
className={classNames(styles.boostMax, subMax + BOOST_MULT <= (boost || 0) && 'invisible')}
size='sm'
onClick={() => updateBoost(subMax + BOOST_MULT)}
>
{abbrNum(subMax + BOOST_MULT)} <small>top of ~{subName}</small>
</Button>}
</div>
)
}
// act means we are adding to existing boost
export function BoostItemInput ({ item, sub, act = false, ...props }) {
// act adds boost to existing boost
const existingBoost = act ? Number(item?.boost || 0) : 0
const [boost, setBoost] = useState(act ? 0 : Number(item?.boost || 0))
const { data, previousData, refetch } = useQuery(gql`
query BoostPosition($sub: String, $id: ID, $boost: Int) {
boostPosition(sub: $sub, id: $id, boost: $boost) {
home
sub
homeMaxBoost
subMaxBoost
}
}`,
{
variables: { sub: item?.subName || sub?.name, boost: existingBoost + boost, id: item?.id },
fetchPolicy: 'cache-and-network',
skip: !!item?.parentId || SSR
})
const getPositionDebounce = useDebounceCallback((...args) => refetch(...args), 1000, [refetch])
const updateBoost = useCallback((boost) => {
const boostToUse = Number(boost || 0)
setBoost(boostToUse)
getPositionDebounce({ sub: item?.subName || sub?.name, boost: Number(existingBoost + boostToUse), id: item?.id })
}, [getPositionDebounce, item?.id, item?.subName, sub?.name, existingBoost])
const dat = data || previousData
const boostMessage = useMemo(() => {
if (!item?.parentId && boost >= BOOST_MULT) {
if (dat?.boostPosition?.home || dat?.boostPosition?.sub || boost > dat?.boostPosition?.homeMaxBoost || boost > dat?.boostPosition?.subMaxBoost) {
const boostPinning = []
if (dat?.boostPosition?.home || boost > dat?.boostPosition?.homeMaxBoost) {
boostPinning.push('homepage')
}
if ((item?.subName || sub?.name) && (dat?.boostPosition?.sub || boost > dat?.boostPosition?.subMaxBoost)) {
boostPinning.push(`~${item?.subName || sub?.name}`)
}
return `pins to the top of ${boostPinning.join(' and ')}`
}
}
return 'ranks posts higher based on the amount'
}, [boost, dat?.boostPosition?.home, dat?.boostPosition?.sub, item?.subName, sub?.name])
return (
<>
<BoostInput
hint={<span className='text-muted'>{boostMessage}</span>}
onChange={(_, e) => {
if (e.target.value >= 0) {
updateBoost(Number(e.target.value))
}
}}
overrideValue={boost}
{...props}
groupClassName='mb-1'
/>
{!item?.parentId &&
<BoostMaxes
subName={item?.subName || sub?.name}
homeMax={(dat?.boostPosition?.homeMaxBoost || 0) - existingBoost}
subMax={(dat?.boostPosition?.subMaxBoost || 0) - existingBoost}
boost={existingBoost + boost}
updateBoost={updateBoost}
/>}
</>
)
}
export default function AdvPostForm ({ children, item, sub, storageKeyPrefix }) {
const { me } = useMe()
const router = useRouter() const router = useRouter()
const [itemType, setItemType] = useState() const [itemType, setItemType] = useState()
const formik = useFormikContext() const formik = useFormikContext()
@ -259,7 +111,39 @@ export default function AdvPostForm ({ children, item, sub, storageKeyPrefix })
body={ body={
<> <>
{children} {children}
<BoostItemInput item={item} sub={sub} /> <Input
label={
<div className='d-flex align-items-center'>boost
<Info>
<ol className='fw-bold'>
<li>Boost ranks posts higher temporarily based on the amount</li>
<li>The minimum boost is {numWithUnits(BOOST_MIN, { abbreviate: false })}</li>
<li>Each {numWithUnits(BOOST_MULT, { abbreviate: false })} of boost is equivalent to one trusted upvote
<ul>
<li>e.g. {numWithUnits(BOOST_MULT * 5, { abbreviate: false })} is like 5 votes</li>
</ul>
</li>
<li>The decay of boost "votes" increases at 1.25x the rate of organic votes
<ul>
<li>i.e. boost votes fall out of ranking faster</li>
</ul>
</li>
<li>100% of sats from boost are given back to top stackers as rewards</li>
</ol>
</Info>
</div>
}
name='boost'
onChange={(_, e) => merge({
boost: {
term: `+ ${e.target.value}`,
label: 'boost',
modifier: cost => cost + Number(e.target.value)
}
})}
hint={<span className='text-muted'>ranks posts higher temporarily based on the amount</span>}
append={<InputGroup.Text className='text-monospace'>sats</InputGroup.Text>}
/>
<VariableInput <VariableInput
label='forward sats to' label='forward sats to'
name='forward' name='forward'
@ -295,7 +179,7 @@ export default function AdvPostForm ({ children, item, sub, storageKeyPrefix })
label={ label={
<div className='d-flex align-items-center'>crosspost to nostr <div className='d-flex align-items-center'>crosspost to nostr
<Info> <Info>
<ul> <ul className='fw-bold'>
{renderCrosspostDetails(itemType)} {renderCrosspostDetails(itemType)}
<li>requires NIP-07 extension for signing</li> <li>requires NIP-07 extension for signing</li>
<li>we use your NIP-05 relays if set</li> <li>we use your NIP-05 relays if set</li>
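Both versions of this form register the boost amount with the fee button via merge; conceptually the fee button folds every registered modifier over the base cost, roughly like the hypothetical sketch below (useFeeButton's internals are not part of this diff, so this is an assumption, not the repository's implementation).

// hypothetical illustration of how registered line items could be combined
function totalCost (baseCost, lineItems) {
  // each line item carries { term, label, modifier } as passed to merge()
  return Object.values(lineItems).reduce((cost, item) => item.modifier(cost), baseCost)
}

// e.g. boost.modifier = cost => cost + Number(e.target.value), so a
// 1000 sat boost turns a 10 sat base cost into 1010 sats on the button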


@ -9,11 +9,4 @@
display: flex; display: flex;
flex: 0 1 fit-content; flex: 0 1 fit-content;
height: fit-content; height: fit-content;
}
.boostMax small {
font-weight: 400;
margin-left: 0.25rem;
margin-right: 0.25rem;
opacity: 0.5;
} }


@ -1,9 +1,8 @@
import { InputGroup } from 'react-bootstrap' import { InputGroup } from 'react-bootstrap'
import { Input } from './form' import { Checkbox, Input } from './form'
import { useMe } from './me' import { useMe } from './me'
import { useEffect, useState } from 'react' import { useEffect, useState } from 'react'
import { isNumber } from '@/lib/format' import { isNumber } from 'mathjs'
import Link from 'next/link'
function autoWithdrawThreshold ({ me }) { function autoWithdrawThreshold ({ me }) {
return isNumber(me?.privates?.autoWithdrawThreshold) ? me?.privates?.autoWithdrawThreshold : 10000 return isNumber(me?.privates?.autoWithdrawThreshold) ? me?.privates?.autoWithdrawThreshold : 10000
@ -12,13 +11,12 @@ function autoWithdrawThreshold ({ me }) {
export function autowithdrawInitial ({ me }) { export function autowithdrawInitial ({ me }) {
return { return {
autoWithdrawThreshold: autoWithdrawThreshold({ me }), autoWithdrawThreshold: autoWithdrawThreshold({ me }),
autoWithdrawMaxFeePercent: isNumber(me?.privates?.autoWithdrawMaxFeePercent) ? me?.privates?.autoWithdrawMaxFeePercent : 1, autoWithdrawMaxFeePercent: isNumber(me?.privates?.autoWithdrawMaxFeePercent) ? me?.privates?.autoWithdrawMaxFeePercent : 1
autoWithdrawMaxFeeTotal: isNumber(me?.privates?.autoWithdrawMaxFeeTotal) ? me?.privates?.autoWithdrawMaxFeeTotal : 1
} }
} }
export function AutowithdrawSettings () { export function AutowithdrawSettings ({ wallet }) {
const { me } = useMe() const me = useMe()
const threshold = autoWithdrawThreshold({ me }) const threshold = autoWithdrawThreshold({ me })
const [sendThreshold, setSendThreshold] = useState(Math.max(Math.floor(threshold / 10), 1)) const [sendThreshold, setSendThreshold] = useState(Math.max(Math.floor(threshold / 10), 1))
@ -29,6 +27,12 @@ export function AutowithdrawSettings () {
return ( return (
<> <>
<Checkbox
disabled={!wallet.isConfigured}
label='enabled'
id='enabled'
name='enabled'
/>
<div className='my-4 border border-3 rounded'> <div className='my-4 border border-3 rounded'>
<div className='p-3'> <div className='p-3'>
<h3 className='text-center text-muted'>desired balance</h3> <h3 className='text-center text-muted'>desired balance</h3>
@ -44,30 +48,13 @@ export function AutowithdrawSettings () {
append={<InputGroup.Text className='text-monospace'>sats</InputGroup.Text>} append={<InputGroup.Text className='text-monospace'>sats</InputGroup.Text>}
required required
/> />
<h3 className='text-center text-muted pt-3'>network fees</h3>
<h6 className='text-center pb-3'>
we'll use whichever setting is higher during{' '}
<Link
target='_blank'
href='https://docs.lightning.engineering/the-lightning-network/pathfinding'
rel='noreferrer'
>pathfinding
</Link>
</h6>
<Input <Input
label='max fee rate' label='max fee'
name='autoWithdrawMaxFeePercent' name='autoWithdrawMaxFeePercent'
hint='max fee as percent of withdrawal amount' hint='max fee as percent of withdrawal amount'
append={<InputGroup.Text>%</InputGroup.Text>} append={<InputGroup.Text>%</InputGroup.Text>}
required required
/> />
<Input
label='max fee total'
name='autoWithdrawMaxFeeTotal'
hint='max fee for any withdrawal amount'
append={<InputGroup.Text className='text-monospace'>sats</InputGroup.Text>}
required
/>
</div> </div>
</div> </div>
</> </>
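For clarity, autowithdrawInitial supplies the form's initial values; with no saved settings the fallbacks above produce roughly the following (the shape differs between the two columns):

const initial = autowithdrawInitial({ me })
// left column:  { autoWithdrawThreshold: 10000, autoWithdrawMaxFeePercent: 1, autoWithdrawMaxFeeTotal: 1 }
// right column: { autoWithdrawThreshold: 10000, autoWithdrawMaxFeePercent: 1 }
// the right column additionally renders an 'enabled' checkbox gated on wallet.isConfigured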


@ -5,7 +5,7 @@ import BootstrapForm from 'react-bootstrap/Form'
import EditImage from '@/svgs/image-edit-fill.svg' import EditImage from '@/svgs/image-edit-fill.svg'
import Moon from '@/svgs/moon-fill.svg' import Moon from '@/svgs/moon-fill.svg'
import { useShowModal } from './modal' import { useShowModal } from './modal'
import { FileUpload } from './file-upload' import { ImageUpload } from './image'
export default function Avatar ({ onSuccess }) { export default function Avatar ({ onSuccess }) {
const [uploading, setUploading] = useState() const [uploading, setUploading] = useState()
@ -49,8 +49,7 @@ export default function Avatar ({ onSuccess }) {
} }
return ( return (
<FileUpload <ImageUpload
allow='image/*'
avatar avatar
onError={e => { onError={e => {
console.log(e) console.log(e)
@ -85,6 +84,6 @@ export default function Avatar ({ onSuccess }) {
? <Moon className='fill-white spin' /> ? <Moon className='fill-white spin' />
: <EditImage className='fill-white' />} : <EditImage className='fill-white' />}
</div> </div>
</FileUpload> </ImageUpload>
) )
} }


@ -1,87 +0,0 @@
import OverlayTrigger from 'react-bootstrap/OverlayTrigger'
import Tooltip from 'react-bootstrap/Tooltip'
import CowboyHatIcon from '@/svgs/cowboy.svg'
import AnonIcon from '@/svgs/spy-fill.svg'
import { numWithUnits } from '@/lib/format'
import { USER_ID } from '@/lib/constants'
import GunIcon from '@/svgs/revolver.svg'
import HorseIcon from '@/svgs/horse.svg'
import classNames from 'classnames'
const BADGES = [
{
icon: CowboyHatIcon,
streakName: 'streak'
},
{
icon: HorseIcon,
streakName: 'horseStreak'
},
{
icon: GunIcon,
streakName: 'gunStreak',
sizeDelta: 2
}
]
export default function Badges ({ user, badge, className = 'ms-1', badgeClassName, spacingClassName = 'ms-1', height = 16, width = 16 }) {
if (!user || Number(user.id) === USER_ID.ad) return null
if (Number(user.id) === USER_ID.anon) {
return (
<BadgeTooltip overlayText='anonymous'>
<span className={className}><AnonIcon className={`${badgeClassName} align-middle`} height={height} width={width} /></span>
</BadgeTooltip>
)
}
return (
<span className={className}>
{BADGES.map(({ icon, streakName, sizeDelta }, i) => (
<SNBadge
key={streakName}
user={user}
badge={badge}
streakName={streakName}
badgeClassName={classNames(badgeClassName, i > 0 && spacingClassName)}
IconForBadge={icon}
height={height}
width={width}
sizeDelta={sizeDelta}
/>
))}
</span>
)
}
function SNBadge ({ user, badge, streakName, badgeClassName, IconForBadge, height = 16, width = 16, sizeDelta = 0 }) {
const streak = user.optional[streakName]
if (streak === null) {
return null
}
return (
<BadgeTooltip
overlayText={streak
? `${numWithUnits(streak, { abbreviate: false, unitSingular: 'day', unitPlural: 'days' })}`
: 'new'}
>
<span><IconForBadge className={badgeClassName} height={height + sizeDelta} width={width + sizeDelta} /></span>
</BadgeTooltip>
)
}
export function BadgeTooltip ({ children, overlayText, placement }) {
return (
<OverlayTrigger
placement={placement || 'bottom'}
overlay={
<Tooltip style={{ position: 'fixed' }}>
{overlayText}
</Tooltip>
}
trigger={['hover', 'focus']}
>
{children}
</OverlayTrigger>
)
}


@ -5,11 +5,11 @@ import { useMe } from '@/components/me'
import { useMutation } from '@apollo/client' import { useMutation } from '@apollo/client'
import { WELCOME_BANNER_MUTATION } from '@/fragments/users' import { WELCOME_BANNER_MUTATION } from '@/fragments/users'
import { useToast } from '@/components/toast' import { useToast } from '@/components/toast'
import Link from 'next/link' import { BALANCE_LIMIT_MSATS } from '@/lib/constants'
import AccordianItem from '@/components/accordian-item' import { msatsToSats, numWithUnits } from '@/lib/format'
export function WelcomeBanner ({ Banner }) { export function WelcomeBanner ({ Banner }) {
const { me } = useMe() const me = useMe()
const toaster = useToast() const toaster = useToast()
const [hidden, setHidden] = useState(true) const [hidden, setHidden] = useState(true)
const handleClose = async () => { const handleClose = async () => {
@ -70,7 +70,7 @@ export function WelcomeBanner ({ Banner }) {
} }
export function MadnessBanner ({ handleClose }) { export function MadnessBanner ({ handleClose }) {
const { me } = useMe() const me = useMe()
return ( return (
<Alert className={styles.banner} key='info' variant='info' onClose={handleClose} dismissible> <Alert className={styles.banner} key='info' variant='info' onClose={handleClose} dismissible>
<Alert.Heading> <Alert.Heading>
@ -101,17 +101,39 @@ export function MadnessBanner ({ handleClose }) {
) )
} }
export function WalletSecurityBanner ({ isActive }) { export function WalletLimitBanner () {
const me = useMe()
const limitReached = me?.privates?.sats >= msatsToSats(BALANCE_LIMIT_MSATS)
if (!me || !limitReached) return
return ( return (
<Alert className={styles.banner} key='info' variant='warning'> <Alert className={styles.banner} key='info' variant='warning'>
<Alert.Heading> <Alert.Heading>
Gunslingin' Safety Tips Your wallet is over the current limit ({numWithUnits(msatsToSats(BALANCE_LIMIT_MSATS))})
</Alert.Heading> </Alert.Heading>
<p className='mb-3 line-height-md'> <p className='mb-1'>
Listen up, pardner! Put a limit on yer spendin' wallet or hook up a wallet that's only for Stacker News. It'll keep them varmints from cleanin' out yer whole goldmine if they rustle up yer wallet. Deposits to your wallet from <strong>outside</strong> of SN are blocked.
</p> </p>
<p className='line-height-md'> <p>
Your spending wallet's credentials are never sent to our servers in plain text. To sync across devices, <Alert.Link as={Link} href='/settings/passphrase'>enable device sync in your settings</Alert.Link>. Please spend or withdraw sats to restore full wallet functionality.
</p>
</Alert>
)
}
export function WalletSecurityBanner () {
return (
<Alert className={styles.banner} key='info' variant='warning'>
<Alert.Heading>
Wallet Security Disclaimer
</Alert.Heading>
<p className='mb-1'>
Your wallet's credentials are stored in the browser and never go to the server.<br />
However, you should definitely <strong>set a budget in your wallet</strong>.
</p>
<p>
Also, for the time being, you will have to reenter your credentials on other devices.
</p> </p>
</Alert> </Alert>
) )
@ -124,24 +146,3 @@ export function AuthBanner () {
</Alert> </Alert>
) )
} }
export function MultiAuthErrorBanner ({ errors }) {
return (
<Alert className={styles.banner} key='info' variant='danger'>
<div className='fw-bold mb-3'>Account switching is currently unavailable</div>
<AccordianItem
className='my-3'
header='We have detected the following issues:'
headerColor='var(--bs-danger-text-emphasis)'
body={
<ul>
{errors.map((err, i) => (
<li key={i}>{err}</li>
))}
</ul>
}
/>
<div className='mt-3'>To resolve these issues, please sign out and sign in again.</div>
</Alert>
)
}


@ -17,8 +17,7 @@ export default function BookmarkDropdownItem ({ item: { id, meBookmark } }) {
id: `Item:${id}`, id: `Item:${id}`,
fields: { fields: {
meBookmark: () => bookmarkItem.meBookmark meBookmark: () => bookmarkItem.meBookmark
}, }
optimistic: true
}) })
} }
} }


@ -1,65 +0,0 @@
import { useShowModal } from './modal'
import { useToast } from './toast'
import ItemAct from './item-act'
import AccordianItem from './accordian-item'
import { useMemo } from 'react'
import getColor from '@/lib/rainbow'
import BoostIcon from '@/svgs/arrow-up-double-line.svg'
import styles from './upvote.module.css'
import { BoostHelp } from './adv-post-form'
import { BOOST_MULT } from '@/lib/constants'
import classNames from 'classnames'
export default function Boost ({ item, className, ...props }) {
const { boost } = item
const [color, nextColor] = useMemo(() => [getColor(boost), getColor(boost + BOOST_MULT)], [boost])
const style = useMemo(() => ({
'--hover-fill': nextColor,
'--hover-filter': `drop-shadow(0 0 6px ${nextColor}90)`,
'--fill': color,
'--filter': `drop-shadow(0 0 6px ${color}90)`
}), [color, nextColor])
return (
<Booster
item={item} As={oprops =>
<div className='upvoteParent'>
<div
className={classNames(styles.upvoteWrapper, item.deletedAt && styles.noSelfTips)}
>
<BoostIcon
{...props}
{...oprops}
style={style}
width={26}
height={26}
className={classNames(styles.boost, className, boost && styles.boosted)}
/>
</div>
</div>}
/>
)
}
function Booster ({ item, As, children }) {
const toaster = useToast()
const showModal = useShowModal()
return (
<As
onClick={async () => {
try {
showModal(onClose =>
<ItemAct onClose={onClose} item={item} act='BOOST' step={BOOST_MULT}>
<AccordianItem header='what is boost?' body={<BoostHelp />} />
</ItemAct>)
} catch (error) {
toaster.danger('failed to boost item')
}
}}
>
{children}
</As>
)
}


@ -23,7 +23,7 @@ export function BountyForm ({
children children
}) { }) {
const client = useApolloClient() const client = useApolloClient()
const { me } = useMe() const me = useMe()
const schema = bountySchema({ client, me, existingBoost: item?.boost }) const schema = bountySchema({ client, me, existingBoost: item?.boost })
const onSubmit = useItemSubmit(UPSERT_BOUNTY, { item, sub }) const onSubmit = useItemSubmit(UPSERT_BOUNTY, { item, sub })
@ -73,14 +73,14 @@ export function BountyForm ({
hint={ hint={
editThreshold editThreshold
? ( ? (
<div className='text-muted fw-bold font-monospace'> <div className='text-muted fw-bold'>
<Countdown date={editThreshold} /> <Countdown date={editThreshold} />
</div> </div>
) )
: null : null
} }
/> />
<AdvPostForm storageKeyPrefix={storageKeyPrefix} item={item} sub={sub} /> <AdvPostForm storageKeyPrefix={storageKeyPrefix} item={item} />
<ItemButtonBar itemId={item?.id} canDelete={false} /> <ItemButtonBar itemId={item?.id} canDelete={false} />
</Form> </Form>
) )


@ -4,6 +4,6 @@ import Button from 'react-bootstrap/Button'
export default function CancelButton ({ onClick }) { export default function CancelButton ({ onClick }) {
const router = useRouter() const router = useRouter()
return ( return (
<Button className='me-3 text-muted nav-link fw-bold' variant='link' onClick={onClick || (() => router.back())}>cancel</Button> <Button className='me-4 text-muted nav-link fw-bold' variant='link' onClick={onClick || (() => router.back())}>cancel</Button>
) )
} }


@ -1,133 +0,0 @@
import { createContext, useCallback, useContext, useEffect, useMemo, useRef, useState } from 'react'
import classNames from 'classnames'
import ArrowLeft from '@/svgs/arrow-left-line.svg'
import ArrowRight from '@/svgs/arrow-right-line.svg'
import styles from './carousel.module.css'
import { useShowModal } from './modal'
import { Dropdown } from 'react-bootstrap'
function useSwiping ({ moveLeft, moveRight }) {
const [touchStartX, setTouchStartX] = useState(null)
const onTouchStart = useCallback((e) => {
if (e.touches.length === 1) {
setTouchStartX(e.touches[0].clientX)
}
}, [])
const onTouchEnd = useCallback((e) => {
if (touchStartX !== null) {
const touchEndX = e.changedTouches[0].clientX
const diff = touchEndX - touchStartX
if (diff > 50) {
moveLeft()
} else if (diff < -50) {
moveRight()
}
setTouchStartX(null)
}
}, [touchStartX, moveLeft, moveRight])
useEffect(() => {
document.addEventListener('touchstart', onTouchStart)
document.addEventListener('touchend', onTouchEnd)
return () => {
document.removeEventListener('touchstart', onTouchStart)
document.removeEventListener('touchend', onTouchEnd)
}
}, [onTouchStart, onTouchEnd])
}
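// page the carousel with the left/right arrow keys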
function useArrowKeys ({ moveLeft, moveRight }) {
const onKeyDown = useCallback((e) => {
if (e.key === 'ArrowLeft') {
moveLeft()
} else if (e.key === 'ArrowRight') {
moveRight()
}
}, [moveLeft, moveRight])
useEffect(() => {
document.addEventListener('keydown', onKeyDown)
return () => document.removeEventListener('keydown', onKeyDown)
}, [onKeyDown])
}
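// full-screen media viewer; clicking outside the nav arrows closes it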
export default function Carousel ({ close, mediaArr, src, originalSrc, setOptions }) {
const [index, setIndex] = useState(mediaArr.findIndex(([key]) => key === src))
const [currentSrc, canGoLeft, canGoRight] = useMemo(() => {
return [mediaArr[index][0], index > 0, index < mediaArr.length - 1]
}, [mediaArr, index])
const moveLeft = useCallback(() => {
setIndex(i => Math.max(0, i - 1))
}, [setIndex])
const moveRight = useCallback(() => {
setIndex(i => Math.min(mediaArr.length - 1, i + 1))
}, [setIndex, mediaArr.length])
useSwiping({ moveLeft, moveRight })
useArrowKeys({ moveLeft, moveRight })
return (
<div className={styles.fullScreenContainer} onClick={close}>
<img className={styles.fullScreen} src={currentSrc} />
<div className={styles.fullScreenNavContainer}>
<div
className={classNames(styles.fullScreenNav, !canGoLeft && 'invisible', styles.left)}
onClick={(e) => {
e.stopPropagation()
moveLeft()
}}
>
<ArrowLeft width={34} height={34} />
</div>
<div
className={classNames(styles.fullScreenNav, !canGoRight && 'invisible', styles.right)}
onClick={(e) => {
e.stopPropagation()
moveRight()
}}
>
<ArrowRight width={34} height={34} />
</div>
</div>
</div>
)
}
const CarouselContext = createContext()
function CarouselOverflow ({ originalSrc, rel }) {
return <Dropdown.Item href={originalSrc} rel={rel} target='_blank'>view original</Dropdown.Item>
}
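// tracks media on the page in a Map keyed by src and exposes showCarousel/addMedia/removeMedia through context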
export function CarouselProvider ({ children }) {
const media = useRef(new Map())
const showModal = useShowModal()
const showCarousel = useCallback(({ src }) => {
showModal((close, setOptions) => {
return <Carousel close={close} mediaArr={Array.from(media.current.entries())} src={src} setOptions={setOptions} />
}, {
fullScreen: true,
overflow: <CarouselOverflow {...media.current.get(src)} />
})
}, [showModal, media.current])
const addMedia = useCallback(({ src, originalSrc, rel }) => {
media.current.set(src, { src, originalSrc, rel })
}, [media.current])
const removeMedia = useCallback((src) => {
media.current.delete(src)
}, [media.current])
const value = useMemo(() => ({ showCarousel, addMedia, removeMedia }), [showCarousel, addMedia, removeMedia])
return <CarouselContext.Provider value={value}>{children}</CarouselContext.Provider>
}
export function useCarousel () {
return useContext(CarouselContext)
}

View File

@@ -1,63 +0,0 @@
div.fullScreenNavContainer {
height: 100%;
width: 100%;
position: absolute;
top: 0;
left: 0;
pointer-events: none;
flex-direction: row;
display: flex;
justify-content: space-between;
align-items: center;
}
img.fullScreen {
cursor: zoom-out !important;
max-height: 100%;
max-width: 100vw;
min-width: 0;
min-height: 0;
align-self: center;
justify-self: center;
user-select: none;
}
.fullScreenContainer {
--bs-columns: 1;
--bs-rows: 1;
display: grid;
width: 100%;
height: 100%;
}
div.fullScreenNav:hover > svg {
background-color: rgba(0, 0, 0, .5);
}
div.fullScreenNav {
cursor: pointer;
pointer-events: auto;
width: 72px;
height: 72px;
display: flex;
align-items: center;
}
div.fullScreenNav.left {
justify-content: flex-start;
}
div.fullScreenNav.right {
justify-content: flex-end;
}
div.fullScreenNav > svg {
border-radius: 50%;
backdrop-filter: blur(4px);
background-color: rgba(0, 0, 0, 0.7);
fill: white;
max-height: 34px;
max-width: 34px;
padding: 0.35rem;
margin: .75rem;
}

View File

@@ -18,8 +18,7 @@ export default function CommentEdit ({ comment, editThreshold, onSuccess, onCanc
         text () {
           return result.text
         }
-      },
-      optimistic: true
+      }
     })
   }
 },

View File

@@ -2,7 +2,7 @@ import itemStyles from './item.module.css'
 import styles from './comment.module.css'
 import Text, { SearchText } from './text'
 import Link from 'next/link'
-import Reply from './reply'
+import Reply, { ReplyOnAnotherPage } from './reply'
 import { useEffect, useMemo, useRef, useState } from 'react'
 import UpVote from './upvote'
 import Eye from '@/svgs/eye-fill.svg'
@@ -25,9 +25,6 @@ import Skull from '@/svgs/death-skull.svg'
 import { commentSubTreeRootId } from '@/lib/item'
 import Pin from '@/svgs/pushpin-fill.svg'
 import LinkToContext from './link-to-context'
-import Boost from './boost-button'
-import { gql, useApolloClient } from '@apollo/client'
-import classNames from 'classnames'
 
 function Parent ({ item, rootText }) {
   const root = useRoot()
@@ -82,7 +79,6 @@ export function CommentFlat ({ item, rank, siblingComments, ...props }) {
       <LinkToContext
         className='py-2'
         onClick={e => {
-          e.preventDefault()
           router.push(href, as)
         }}
         href={href}
@@ -97,14 +93,13 @@
 export default function Comment ({
   item, children, replyOpen, includeParent, topLevel,
-  rootText, noComments, noReply, truncate, depth, pin, setDisableRetry, disableRetry
+  rootText, noComments, noReply, truncate, depth, pin
 }) {
   const [edit, setEdit] = useState()
-  const { me } = useMe()
-  const isHiddenFreebie = me?.privates?.satsFilter !== 0 && !item.mine && item.freebie && !item.freedFreebie
-  const isDeletedChildless = item?.ncomments === 0 && item?.deletedAt
+  const me = useMe()
+  const isHiddenFreebie = !me?.privates?.wildWestMode && !me?.privates?.greeterMode && !item.mine && item.freebie && !item.freedFreebie
   const [collapse, setCollapse] = useState(
-    (isHiddenFreebie || isDeletedChildless || item?.user?.meMute || (item?.outlawed && !me?.privates?.wildWestMode)) && !includeParent
+    (isHiddenFreebie || item?.user?.meMute || (item?.outlawed && !me?.privates?.wildWestMode)) && !includeParent
       ? 'yep'
       : 'nope')
   const ref = useRef(null)
@@ -112,32 +107,16 @@
   const root = useRoot()
   const { ref: textRef, quote, quoteReply, cancelQuote } = useQuoteReply({ text: item.text })
-  const { cache } = useApolloClient()
 
   useEffect(() => {
-    const comment = cache.readFragment({
-      id: `Item:${router.query.commentId}`,
-      fragment: gql`
-        fragment CommentPath on Item {
-          path
-        }`
-    })
-    if (comment?.path.split('.').includes(item.id)) {
-      window.localStorage.setItem(`commentCollapse:${item.id}`, 'nope')
-    }
     setCollapse(window.localStorage.getItem(`commentCollapse:${item.id}`) || collapse)
     if (Number(router.query.commentId) === Number(item.id)) {
-      // HACK wait for other comments to uncollapse if they're collapsed
+      // HACK wait for other comments to collapse if they're collapsed
       setTimeout(() => {
         ref.current.scrollIntoView({ behavior: 'instant', block: 'start' })
-        // make sure we can outline a comment again if it was already outlined before
-        ref.current.addEventListener('animationend', () => {
-          ref.current.classList.remove('outline-it')
-        }, { once: true })
         ref.current.classList.add('outline-it')
       }, 100)
     }
-  }, [item.id, cache, router.query.commentId])
+  }, [item.id, router.query.commentId])
@@ -147,7 +126,7 @@
     }
   }, [item.id])
 
-  const bottomedOut = depth === COMMENT_DEPTH_LIMIT || (item.comments?.comments.length === 0 && item.nDirectComments > 0)
+  const bottomedOut = depth === COMMENT_DEPTH_LIMIT
   // Don't show OP badge when anon user comments on anon user posts
   const op = root.user.name === item.user.name && Number(item.user.id) !== USER_ID.anon
     ? 'OP'
@@ -165,11 +144,9 @@
       <div className={`${itemStyles.item} ${styles.item}`}>
         {item.outlawed && !me?.privates?.wildWestMode
           ? <Skull className={styles.dontLike} width={24} height={24} />
-          : item.mine
-            ? <Boost item={item} className={styles.upvote} />
-            : item.meDontLikeSats > item.meSats
-              ? <DownZap width={24} height={24} className={styles.dontLike} item={item} />
-              : pin ? <Pin width={22} height={22} className={styles.pin} /> : <UpVote item={item} className={styles.upvote} collapsed={collapse === 'yep'} />}
+          : item.meDontLikeSats > item.meSats
+            ? <DownZap width={24} height={24} className={styles.dontLike} item={item} />
+            : pin ? <Pin width={22} height={22} className={styles.pin} /> : <UpVote item={item} className={styles.upvote} />}
         <div className={`${itemStyles.hunk} ${styles.hunk}`}>
           <div className='d-flex align-items-center'>
             {item.user?.meMute && !includeParent && collapse === 'yep'
@@ -189,8 +166,6 @@
               embellishUser={op && <><span> </span><Badge bg={op === 'fwd' ? 'secondary' : 'boost'} className={`${styles.op} bg-opacity-75`}>{op}</Badge></>}
               onQuoteReply={quoteReply}
               nested={!includeParent}
-              setDisableRetry={setDisableRetry}
-              disableRetry={disableRetry}
               extraInfo={
                 <>
                   {includeParent && <Parent item={item} rootText={rootText} />}
@@ -200,8 +175,7 @@
                   </ActionTooltip>}
                 </>
               }
-              edit={edit}
-              toggleEdit={e => { setEdit(!edit) }}
+              onEdit={e => { setEdit(!edit) }}
               editText={edit ? 'cancel' : 'edit'}
             />}
@@ -249,7 +223,7 @@
       </div>
       {collapse !== 'yep' && (
         bottomedOut
-          ? <div className={styles.children}><div className={classNames(styles.comment, 'mt-3')}><ReplyOnAnotherPage item={item} /></div></div>
+          ? <div className={styles.children}><ReplyOnAnotherPage item={item} /></div>
           : (
             <div className={styles.children}>
               {item.outlawed && !me?.privates?.wildWestMode
@@ -260,17 +234,11 @@
                 </Reply>}
               {children}
               <div className={styles.comments}>
-                {!noComments && item.comments?.comments
-                  ? (
-                    <>
-                      {item.comments.comments.map((item) => (
-                        <Comment depth={depth + 1} key={item.id} item={item} />
-                      ))}
-                      {item.comments.comments.length < item.nDirectComments && <ViewAllReplies id={item.id} nhas={item.ncomments} />}
-                    </>
-                    )
+                {item.comments && !noComments
+                  ? item.comments.map((item) => (
+                    <Comment depth={depth + 1} key={item.id} item={item} />
+                  ))
                   : null}
-                {/* TODO: add link to more comments if they're limited */}
               </div>
             </div>
           )
@@ -279,34 +247,6 @@
   )
 }
 
-export function ViewAllReplies ({ id, nshown, nhas }) {
-  const text = `view all ${nhas} replies`
-
-  return (
-    <div className={`d-block fw-bold ${styles.comment} pb-2 ps-3`}>
-      <Link href={`/items/${id}`} as={`/items/${id}`} className='text-muted'>
-        {text}
-      </Link>
-    </div>
-  )
-}
-
-function ReplyOnAnotherPage ({ item }) {
-  const root = useRoot()
-  const rootId = commentSubTreeRootId(item, root)
-
-  let text = 'reply on another page'
-  if (item.ncomments > 0) {
-    text = `view all ${item.ncomments} replies`
-  }
-
-  return (
-    <Link href={`/items/${rootId}?commentId=${item.id}`} as={`/items/${rootId}`} className='d-block pb-2 fw-bold text-muted'>
-      {text}
-    </Link>
-  )
-}
-
 export function CommentSkeleton ({ skeletonChildren }) {
   return (
     <div className={styles.comment}>

View File

@@ -1,4 +1,4 @@
-import { Fragment, useMemo } from 'react'
+import { Fragment } from 'react'
 import Comment, { CommentSkeleton } from './comment'
 import styles from './header.module.css'
 import Nav from 'react-bootstrap/Nav'
@@ -6,8 +6,6 @@ import Navbar from 'react-bootstrap/Navbar'
 import { numWithUnits } from '@/lib/format'
 import { defaultCommentSort } from '@/lib/item'
 import { useRouter } from 'next/router'
-import MoreFooter from './more-footer'
-import { FULL_COMMENTS_THRESHOLD } from '@/lib/constants'
 
 export function CommentsHeader ({ handleSort, pinned, bio, parentCreatedAt, commentSats }) {
   const router = useRouter()
@@ -62,13 +60,10 @@ export function CommentsHeader ({ handleSort, pinned, bio, parentCreatedAt, comm
   )
 }
 
-export default function Comments ({
-  parentId, pinned, bio, parentCreatedAt,
-  commentSats, comments, commentsCursor, fetchMoreComments, ncomments, ...props
-}) {
+export default function Comments ({ parentId, pinned, bio, parentCreatedAt, commentSats, comments, ...props }) {
   const router = useRouter()
-  const pins = useMemo(() => comments?.filter(({ position }) => !!position).sort((a, b) => a.position - b.position), [comments])
+  const pins = comments?.filter(({ position }) => !!position).sort((a, b) => a.position - b.position)
 
   return (
     <>
@@ -96,12 +91,6 @@
       {comments.filter(({ position }) => !position).map(item => (
         <Comment depth={1} key={item.id} item={item} {...props} />
       ))}
-      {ncomments > FULL_COMMENTS_THRESHOLD &&
-        <MoreFooter
-          cursor={commentsCursor} fetchMore={fetchMoreComments} noMoreText=' '
-          count={comments?.length}
-          Skeleton={CommentsSkeleton}
-        />}
     </>
   )
 }

View File

@@ -43,7 +43,7 @@ export function CompactLongCountdown (props) {
         ? ` ${props.formatted.hours}:${props.formatted.minutes}:${props.formatted.seconds}`
         : Number(props.formatted.minutes) > 0
           ? ` ${props.formatted.minutes}:${props.formatted.seconds}`
-          : Number(props.formatted.seconds) >= 0
+          : Number(props.formatted.seconds) > 0
             ? ` ${props.formatted.seconds}s`
             : ' '}
     </>

View File

@@ -34,36 +34,20 @@ const setTheme = (dark) => {
 const listenForThemeChange = (onChange) => {
   const mql = window.matchMedia(PREFER_DARK_QUERY)
-  const onMqlChange = () => {
+  mql.onchange = mql => {
     const { user, dark } = getTheme()
     if (!user) {
       handleThemeChange(dark)
       onChange({ user, dark })
     }
   }
-  mql.addEventListener('change', onMqlChange)
-
-  const onStorage = (e) => {
+  window.onstorage = e => {
     if (e.key === STORAGE_KEY) {
       const dark = JSON.parse(e.newValue)
       setTheme(dark)
       onChange({ user: true, dark })
     }
   }
-  window.addEventListener('storage', onStorage)
-
-  const root = window.document.documentElement
-  const observer = new window.MutationObserver(() => {
-    const theme = root.getAttribute('data-bs-theme')
-    onChange(dark => ({ ...dark, dark: theme === 'dark' }))
-  })
-  observer.observe(root, { attributes: true, attributeFilter: ['data-bs-theme'] })
-
-  return () => {
-    observer.disconnect()
-    mql.removeEventListener('change', onMqlChange)
-    window.removeEventListener('storage', onStorage)
-  }
 }
@@ -72,7 +56,7 @@ export default function useDarkMode () {
   useEffect(() => {
     const { user, dark } = getTheme()
     setDark({ user, dark })
-    return listenForThemeChange(setDark)
+    listenForThemeChange(setDark)
   }, [])
 
   return [dark?.dark, () => {

View File

@@ -30,8 +30,7 @@ export default function Delete ({ itemId, children, onDelete, type = 'post' }) {
         url: () => deleteItem.url,
         pollCost: () => deleteItem.pollCost,
         deletedAt: () => deleteItem.deletedAt
-      },
-      optimistic: true
+      }
     })
   }
 }

View File

@@ -22,7 +22,7 @@ export function DiscussionForm ({
 }) {
   const router = useRouter()
   const client = useApolloClient()
-  const { me } = useMe()
+  const me = useMe()
   const onSubmit = useItemSubmit(UPSERT_DISCUSSION, { item, sub })
   const schema = discussionSchema({ client, me, existingBoost: item?.boost })
   // if Web Share Target API was used
@@ -76,10 +76,10 @@ export function DiscussionForm ({
         name='text'
         minRows={6}
         hint={editThreshold
-          ? <div className='text-muted fw-bold font-monospace'><Countdown date={editThreshold} /></div>
+          ? <div className='text-muted fw-bold'><Countdown date={editThreshold} /></div>
           : null}
       />
-      <AdvPostForm storageKeyPrefix={storageKeyPrefix} item={item} sub={sub} />
+      <AdvPostForm storageKeyPrefix={storageKeyPrefix} item={item} />
       <ItemButtonBar itemId={item?.id} />
       {!item &&
         <div className={`mt-3 ${related.length > 0 ? '' : 'invisible'}`}>
View File

@@ -17,12 +17,7 @@ export function DownZap ({ item, ...props }) {
     }
     : undefined), [meDontLikeSats])
   return (
-    <DownZapper
-      item={item} As={({ ...oprops }) =>
-        <div className='upvoteParent'>
-          <Flag {...props} {...oprops} style={style} />
-        </div>}
-    />
+    <DownZapper item={item} As={({ ...oprops }) => <Flag {...props} {...oprops} style={style} />} />
   )
 }
@@ -36,7 +31,7 @@ function DownZapper ({ item, As, children }) {
         try {
           showModal(onClose =>
             <ItemAct
-              onClose={onClose} item={item} act='DONT_LIKE_THIS'
+              onClose={onClose} item={item} down
             >
               <AccordianItem
                 header='what is a downzap?' body={
@@ -84,8 +79,7 @@ export function OutlawDropdownItem ({ item }) {
       id: `Item:${item.id}`,
       fields: {
         outlawed: () => true
-      },
-      optimistic: true
+      }
     })
   }
 }

View File

@@ -1,215 +0,0 @@
import { memo, useEffect, useRef, useState } from 'react'
import classNames from 'classnames'
import useDarkMode from './dark-mode'
import styles from './text.module.css'
import { Button } from 'react-bootstrap'
import { TwitterTweetEmbed } from 'react-twitter-embed'
import YouTube from 'react-youtube'
function TweetSkeleton ({ className }) {
return (
<div className={classNames(styles.tweetsSkeleton, className)}>
<div className={styles.tweetSkeleton}>
<div className={`${styles.img} clouds`} />
<div className={styles.content1}>
<div className={`${styles.line} clouds`} />
<div className={`${styles.line} clouds`} />
<div className={`${styles.line} clouds`} />
</div>
</div>
</div>
)
}
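// embeds a nostr note via njump.me; the iframe height is set from the height the embed posts back via postMessage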
export const NostrEmbed = memo(function NostrEmbed ({ src, className, topLevel, darkMode, id }) {
const [show, setShow] = useState(false)
const iframeRef = useRef(null)
useEffect(() => {
if (!iframeRef.current) return
const setHeightFromIframe = (e) => {
if (e.origin !== 'https://njump.me' || !e?.data?.height || e.source !== iframeRef.current.contentWindow) return
iframeRef.current.height = `${e.data.height}px`
}
window?.addEventListener('message', setHeightFromIframe)
const handleIframeLoad = () => {
iframeRef.current.contentWindow.postMessage({ setDarkMode: darkMode }, '*')
}
if (iframeRef.current.complete) {
handleIframeLoad()
} else {
iframeRef.current.addEventListener('load', handleIframeLoad)
}
// https://github.com/vercel/next.js/issues/39451
iframeRef.current.src = `https://njump.me/${id}?embed=yes`
return () => {
window?.removeEventListener('message', setHeightFromIframe)
iframeRef.current?.removeEventListener('load', handleIframeLoad)
}
}, [iframeRef.current, darkMode])
return (
<div className={classNames(styles.nostrContainer, !show && styles.twitterContained, className)}>
<iframe
ref={iframeRef}
width={topLevel ? '550px' : '350px'}
style={{ maxWidth: '100%' }}
height={iframeRef.current?.height || (topLevel ? '200px' : '150px')}
frameBorder='0'
sandbox='allow-scripts allow-same-origin allow-popups allow-popups-to-escape-sandbox'
allow=''
/>
{!show &&
<Button size='md' variant='info' className={styles.twitterShowFull} onClick={() => setShow(true)}>
<div>show full note</div>
<small className='fw-normal fst-italic'>or other stuff</small>
</Button>}
</div>
)
})
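// strips the /intl-xx/ segment from the Spotify URL and mounts the embed through Spotify's iframe API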
const SpotifyEmbed = function SpotifyEmbed ({ src, className }) {
const iframeRef = useRef(null)
// https://open.spotify.com/track/1KFxcj3MZrpBGiGA8ZWriv?si=f024c3aa52294aa1
// Remove any additional path segments
const url = new URL(src)
url.pathname = url.pathname.replace(/\/intl-\w+\//, '/')
useEffect(() => {
if (!iframeRef.current) return
const id = url.pathname.split('/').pop()
// https://developer.spotify.com/documentation/embeds/tutorials/using-the-iframe-api
window.onSpotifyIframeApiReady = (IFrameAPI) => {
const options = {
uri: `spotify:episode:${id}`
}
const callback = (EmbedController) => {}
IFrameAPI.createController(iframeRef.current, options, callback)
}
return () => { window.onSpotifyIframeApiReady = null }
}, [iframeRef.current, url.pathname])
return (
<div className={classNames(styles.spotifyWrapper, className)}>
<iframe
ref={iframeRef}
title='Spotify Web Player'
src={`https://open.spotify.com/embed${url.pathname}`}
width='100%'
height='152'
allowFullScreen
frameBorder='0'
allow='encrypted-media; clipboard-write;'
style={{ borderRadius: '12px' }}
sandbox='allow-scripts allow-popups allow-popups-to-escape-sandbox allow-same-origin allow-presentation'
/>
</div>
)
}
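// picks the right embed for a detected provider (twitter, nostr, wavlake, spotify, youtube, rumble, peertube)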
const Embed = memo(function Embed ({ src, provider, id, meta, className, topLevel, onError }) {
const [darkMode] = useDarkMode()
const [overflowing, setOverflowing] = useState(true)
const [show, setShow] = useState(false)
// This Twitter embed could use similar logic to the video embeds below
if (provider === 'twitter') {
return (
<div className={classNames(styles.twitterContainer, !show && styles.twitterContained, className)}>
<TwitterTweetEmbed
tweetId={id}
options={{ theme: darkMode ? 'dark' : 'light', width: topLevel ? '550px' : '350px' }}
key={darkMode ? '1' : '2'}
placeholder={<TweetSkeleton className={className} />}
onLoad={() => setOverflowing(true)}
/>
{overflowing && !show &&
<Button size='lg' variant='info' className={styles.twitterShowFull} onClick={() => setShow(true)}>
show full tweet
</Button>}
</div>
)
}
if (provider === 'nostr') {
return (
<NostrEmbed src={src} className={className} topLevel={topLevel} id={id} darkMode={darkMode} />
)
}
if (provider === 'wavlake') {
return (
<div className={classNames(styles.wavlakeWrapper, className)}>
<iframe
src={`https://embed.wavlake.com/track/${id}`} width='100%' height='380' frameBorder='0'
allow='encrypted-media'
sandbox='allow-scripts allow-popups allow-popups-to-escape-sandbox allow-forms allow-same-origin'
/>
</div>
)
}
if (provider === 'spotify') {
return (
<SpotifyEmbed src={src} className={className} />
)
}
if (provider === 'youtube') {
return (
<div className={classNames(styles.videoWrapper, className)}>
<YouTube
videoId={id} className={styles.videoContainer} opts={{
playerVars: {
start: meta?.start || 0
}
}}
/>
</div>
)
}
if (provider === 'rumble') {
return (
<div className={classNames(styles.videoWrapper, className)}>
<div className={styles.videoContainer}>
<iframe
title='Rumble Video'
allowFullScreen
src={meta?.href}
sandbox='allow-scripts'
/>
</div>
</div>
)
}
if (provider === 'peertube') {
return (
<div className={classNames(styles.videoWrapper, className)}>
<div className={styles.videoContainer}>
<iframe
title='PeerTube Video'
allowFullScreen
src={meta?.href}
sandbox='allow-scripts'
/>
</div>
</div>
)
}
return null
})
export default Embed

View File

@@ -6,7 +6,7 @@ import copy from 'clipboard-copy'
 import { LoggerContext } from './logger'
 import Button from 'react-bootstrap/Button'
 import { useToast } from './toast'
-import { decodeMinifiedStackTrace } from '@/lib/stacktrace'
 
 class ErrorBoundary extends Component {
   constructor (props) {
     super(props)
@@ -27,7 +27,7 @@ class ErrorBoundary extends Component {
   getErrorDetails () {
     let details = this.state.error.stack
     if (this.state.errorInfo?.componentStack) {
-      details += `\n\nComponent stack:\n ${this.state.errorInfo.componentStack}`
+      details += `\n\nComponent stack:${this.state.errorInfo.componentStack}`
     }
     return details
   }
@@ -69,8 +69,7 @@
   const toaster = useToast()
   const onClick = async () => {
     try {
-      const decodedDetails = await decodeMinifiedStackTrace(errorDetails)
-      await copy(decodedDetails)
+      await copy(errorDetails)
       toaster?.success?.('copied')
     } catch (err) {
       console.error(err)

Some files were not shown because too many files have changed in this diff.