sam local invoke "Dalleanek" -e tests/jtest.json
There is:
- https://github.com/janek/dalleanek - main and dev branches
- deno deploy - strangely doesn’t show the dev deployment
Old
- serverless dashboard
- has an “interact” tab
Testing and debugging
- open Home → Projects/dalleanek in Arc for all links
- use serverless dashboard for test events
- write a message to the live bot as a test
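A minimal test event can be generated with a short script. This is a sketch only: the exact shape depends on how the function is exposed (a function URL / API Gateway proxy wraps the Telegram update as a JSON string under "body"), and the chat/user values below are placeholders - compare against an event printed from the real bot.

```python
# Sketch: write a minimal Telegram-update test event to tests/event.json.
# Field values are placeholders; adjust to match a real printed event.
import json

update = {
    "update_id": 1,
    "message": {
        "message_id": 1,
        "chat": {"id": 123456789, "type": "private"},
        "from": {"id": 123456789, "is_bot": False, "first_name": "Test"},
        "text": "g a cat painted by vermeer",
    },
}

# Proxy-style Lambda events carry the update as a JSON *string* in "body".
event = {"body": json.dumps(update)}

with open("tests/event.json", "w") as f:
    json.dump(event, f, indent=2)
```

The resulting file can then be replayed via `sam local invoke`, `serverless invoke local --path`, or pasted into the serverless dashboard as a test event.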
Deployment
serverless deploy
older
pip install -r requirements.txt -t vendored
cmd+shift+r → serverless invoke local --function post --path tests/event.json (~2-10 sec) - (doesn't work as of 3.05.23)
- quick open websites
cmd+shift+k lambda, cmd+shift+k logs
One-time config
- get telegram token from Botfather
- set telegram webhook with curl
- get Dalle token from “tasks” thing in Network tab after generating at labs.openai.com
- set a Lambda function URL: https://docs.aws.amazon.com/lambda/latest/dg/urls-configuration.html
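A Python equivalent of the webhook-with-curl step, pointing the bot at the Lambda function URL. Hedged sketch: the env var name and the function URL are placeholders, not the real values.

```python
# Point the Telegram webhook at the Lambda function URL (placeholder values).
import os
import requests

TELEGRAM_TOKEN = os.environ["TELEGRAM_TOKEN"]
FUNCTION_URL = "https://<url-id>.lambda-url.<region>.on.aws/"  # from the Lambda console

r = requests.get(
    f"https://api.telegram.org/bot{TELEGRAM_TOKEN}/setWebhook",
    params={"url": FUNCTION_URL},
)
print(r.json())

# Check what webhook is currently registered:
info = requests.get(f"https://api.telegram.org/bot{TELEGRAM_TOKEN}/getWebhookInfo")
print(info.json())
```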
3.05 log
- trying to use openai’s official package
- remembered how hard it is with packages for this
- noted that you can just make a web request instead of using the lib. Should be way easier, no? (see the sketch after this list)
Debugging lambda infinite loop on content filter failure
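Sketch of the "just make a web request" idea from the log above, against the official OpenAI Images API (API-key based, not the labs.openai.com sess-token flow used elsewhere in these notes). Prompt and parameters are illustrative.

```python
# Generate images with a plain HTTP request instead of the openai package.
import os
import requests

resp = requests.post(
    "https://api.openai.com/v1/images/generations",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "prompt": "a man frustrated that he cannot find wheels, painting by vermeer",
        "n": 4,
        "size": "1024x1024",
    },
    timeout=60,
)
resp.raise_for_status()
urls = [item["url"] for item in resp.json()["data"]]
print(urls)
```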
Status log
V1: make.com + lambda
- getting images from DALL•E works
- error handling is problematic
- V1: loop completed: whitelist checking works, generating images and getting them in a message works
V2: lambda only
- connected AWS in VSCode
- set webhook in Telegram?
- test if make one works
- test if there’s a getWebhook
- went a tiny bit too far with a tutorial that didn’t work, went back to a simpler tutorial that looks good
- ok, lambda gets a message and responds
- realized that one “life” of a function doesn’t have state and such - e.g. can’t deal well with someone just spamming. Any state would have to go through a DB, which can be fine, but keep it in mind
- how to test locally?
- not sure how to run it locally if it’s webhook-based, but can’t be too hard
- but if it runs, does online have to be disabled, will there be 2 webhooks?
- don’t need a full dev version to test, just define a test event locally and run it
- serverless and passing the event works. Had to print an event into the bot to know what it looks like, but worked well. For some reason cannot save stuff to return, but
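A minimal handler sketch along those lines, assuming the handler is the "post" function referenced in the invoke command above: log the raw event (useful for building tests/event.json) and return a proxy-style response, since for a function URL / API Gateway the returned "stuff" has to be a dict with statusCode and a string body.

```python
# Assumed handler name; shapes follow the function URL / API Gateway proxy convention.
import json

def post(event, context):
    print(json.dumps(event))  # ends up in CloudWatch logs
    update = json.loads(event.get("body") or "{}")
    text = update.get("message", {}).get("text", "")
    return {"statusCode": 200, "body": json.dumps({"ok": True, "echo": text})}
```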
Technical debt
- Dependencies are added in two different ways: vendored and layers (see the sys.path sketch after this list)
- python telegram bot library doesn’t have a pre-made ARN available
- dalle2 library has to be vendored because I’ve made a lot of changes
- unfortunately, PIL cannot be vendored (see SO). I tried adding to serverless config, but it didn’t work fast enough, so I used Klayers
- Relevant lambda settings - perhaps they can be exported, but at the moment aren’t saved anywhere
- IAM user creation
- create function from serverless based on the tutorial linked in this article
- throttling and concurrency
- retries (warning: this only applies to async, so I think it doesn’t do anything here)
- timeout
- → they can most likely be defined in serverless
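The vendored half usually works by prepending the directory to sys.path before importing anything from it. This is a common pattern for the `pip install -t vendored` setup above, not necessarily exactly what this repo does.

```python
import os
import sys

# Make packages installed with `pip install -r requirements.txt -t vendored` importable.
sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), "vendored"))

import telegram  # noqa: E402 - resolved from ./vendored if it was installed there
```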
TODO:
- maintenance into the bot
- whitelisting into the bot
- use files (from Johannes)
- /g as alias (from piotr)
- gf for files?
- error handling
- authorization error (bad token)
- test
- any OpenAI error
- low credits
- don’t worry about error handling for now; also consider that errors need to be converted to JSON for Lambda anyway
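Sketch of the "convert errors to JSON" point, with assumed handler names: whatever fails (bad token, OpenAI error, low credits), return a well-formed response. Returning a non-2xx makes Telegram retry the same update, which is one way to end up in the kind of loop noted under "Debugging lambda infinite loop" above.

```python
# Assumed names; illustrates error-to-JSON mapping only.
import json

def handle(update):
    ...  # hypothetical: generate images, send them via Telegram

def post(event, context):
    try:
        handle(json.loads(event.get("body") or "{}"))
        body = {"ok": True}
    except Exception as exc:  # bad token, OpenAI error, low credits, ...
        print(f"error: {exc!r}")
        body = {"ok": False, "error": str(exc)}
    # Always 200 so Telegram doesn't keep re-delivering the same update.
    return {"statusCode": 200, "body": json.dumps(body)}
```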
Perspectives and ideas
- this is for fun and don’t put in too much time (?)
- should be against TOS, but is it really? discord as the only source
- Grudzień wrote to OpenAI about GPT-3, it’s easy and good to write (from privaterelay or svt, write it first though)
- write to the person that’s in charge of the artist program
- wanting to be part of the community, and also wanting to be early
- If the bot is really good (inpainting and stuff) there could be a place for it when API is eventually released
- save to gallery is a big complaint at the moment
- especially for variations, making a nice “board” image as an option could be very nice
- https://dallery.gallery/dalle-openai-features-ideas-roadmap/ read again
- Idea - stickers
- Idea (Dave) - generate personalized telegram stickers
Docs drafts
/info and /start
Usage:
/info shows this message
To generate images, type “g” followed by your prompt. For example:
”g a man frustrated that he cannot find wheels for his wood project, painting by vermeer”
Privacy:
This uses external services and is not fully private. OpenAI automatically shows me the last 50 generations whenever I log in (but not who generated them). Metadata is saved by a service that made it faster to build the bot, but I don’t want to and won’t look at it.
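Purely illustrative parsing of the “g” convention described in the draft above; the extra prefixes are only the TODO ideas, not implemented behaviour.

```python
from typing import Optional

def extract_prompt(text: str) -> Optional[str]:
    text = text.strip()
    for prefix in ("g ", "/g "):  # "/g" alias is a TODO idea, not shipped
        if text.lower().startswith(prefix):
            return text[len(prefix):].strip()
    return None

assert extract_prompt("g a cat painted by vermeer") == "a cat painted by vermeer"
assert extract_prompt("/info") is None
```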
ADR (Architectural Decision Record)
- tempted to use Integromat/make.com, because it seems like it could be fast
- also considering using code, as then I could share it on GitHub
- after fiddling in network tab in console it seems like actually making requests and getting images could be a bit complicated
- looked at Ming (1998)’s project again and found that it uses an npm package
- code seems like the better solution now, since the npm package could save a lot of work
- where to host it if it’s in code?
- found https://github.com/python-telegram-bot/python-telegram-bot/wiki/Where-to-host-Telegram-Bots and it seems like there are good options for hosting. Run locally now and worry later
- prob serverless (aws, google, vercel, cloudflare) or raspi
- if it’s in a serverless function anyway, maybe integromat + cloud func would be fastest?
- only write openai adjacent code for now, not telegram adjacent
- setting up a cloud function for this turned out annoying
- (seems like) google doesn’t actually have authentication, so it would have to be public
- AWS works but I couldn’t get myself to do it without having a smooth known process
- reconsidering code-only or code-mostly
- raspi
- hetzner
- cbase?
- after a moment, make.com + lambda seems like the easiest option after all. Just need to re-learn and document how to use Lambda
- make.com worked out nicely, lambda - struggling a bit to get npm packages to work
- actually doesn’t have to be in JS, so I can also rewrite it in python, which probably won’t have deps
- V1 Stack: make.com + Lambda + https://github.com/ezzcodeezzlife/dalle2-in-python
- make.com worked great (this time, finally) for telegram bot interaction and whitelist checking
- Lambda is used to run code that calls the DALL•E API and returns image URLs
- uses https://github.com/ezzcodeezzlife/dalle2-in-python - pretty simple, but saved a good amount of time vs reimplementing
- make.com downloads images and sends them via telegram
- make.com could maybe also be used for the part that’s on Lambda - but no big point in doing that now
- Reached operations limit on Make
- turns out that they count individual operations, not executions of a scenario
- turns out they don’t have a monthly plan despite displaying all prices per mo
- still happy that I went with Make, was very much worth it as a prototype
- Considering Lambda as hosting and Python or JS
- Lambda because already using it, and because serverless is so much easier (considering boulderbot)
- Python because already using it and know it better and to make a quick decision
- arguments for JS were: slightly better dalle package, maybe 1 other thing
- realized some considerations w.r.t. lack of state in Lambda, but the choice still seems valid
- Implemented on Lambda. Definitely had some issues, and still not 100% sure, but it seems like it’s working out now
- Database:
- considered sheetdb, gspread and DynamoDB
- there could easily be an option that escaped consideration
- Chose gspread with the perspective of switching to dynamodb at some point
- sheetdb
- + would have been easy to work with google sheets
- - paid dependency
- but could have maybe written to Kuba to unlock
- gspread
- it’s cool to have direct control and insight via google sheets
- it’s familiar to use google sheets
- it’s slow and will get slower with more users
- implementation takes longer, because CRUD needs to be hand-implemented (see the whitelist sketch after this list)
- at least it’s pretty fun
- DynamoDB
- native to AWS
- fast
- a real database
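A hedged sketch of what the gspread route looks like for the whitelist. Sheet name, worksheet layout, and the service-account file are assumptions, not the actual setup.

```python
# Whitelist check and logging against a Google Sheet via gspread (assumed layout).
import gspread

gc = gspread.service_account(filename="service_account.json")
ws = gc.open("dalleanek").sheet1  # column A: whitelisted Telegram chat ids

def is_whitelisted(chat_id: int) -> bool:
    return str(chat_id) in ws.col_values(1)

def log_generation(chat_id: int, prompt: str) -> None:
    ws.append_row([str(chat_id), prompt])  # hand-rolled "CRUD": just append a row
```

Each call hits the Sheets API over the network, which is why this gets slower with more users and why DynamoDB remains the likely eventual replacement.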
Resources
OpenAI auth tokens
- looks like OpenAI rotates keys in some scenarios; found a post on reddit but can’t find it again
rXyadHmrvgEYJvhn8NSSvVroI15rvrApgbZA4wu6 - pulled early when following instructions from Ming/1998
sess-7rTAt6HHVkeTyY5x5WXdHXqb26sw9lTdOpGKI4TI - pulled on 11.08 following instructions from dalle-node - npm
Pictures
Outdated
Updating DALL•E token
- expires every now and then, but error reporting for that case should work
- labs.openai.com → make a prompt → open network tab → find tasks → copy sess-XYZ(..)
set -Ux D(...) (…)
serverless deploy (will push the newly set system env variable to the deployment)
