Archive
Complete
- maybe also move normal aggregate to code blocks and see how it looks?
- aggregate report changes too much, send an alert
- test branch
- special syntax for either/or?
- turn first chromedp failover into warn
- you have to make the custom ack to interaction respond, not this ------ still says not respond, investigate later
- Raspberry Pi OS is pissing me off, switch it to an Arch fork or sth
- i changed the ebay price to the lowest ever recorded price, so theres gonna be a lot fewer ebay listings now
- setup query logs command
- maybe will have to add a normalized price bc some shit apparently has a lot of scammers
- figure out how to add size, eh just putting it in the name is fine
- add support for picking what type of item from ebay? used only, open box etc…
- Maybe also add automated amazon crawling?
- maybe remove shutdown?
- bug: cant manually shut down rn bc docker compose restarts it
- if it gets unmanageable and unreadable maybe do 2 get alls, one detailed with everything, and one only with current price, no sources etc
- im gonna kill myself ebay has an API and I didnt need to scrape it?
- or maybe change the item architecture to support variants, item categories? current price tree?
- remove message test on deployed pi, it returns a 64 bit
- if DB unexpectedly exits, ping the discord, with the err msg before going down
- add autocomplete for uri field
- will have to unwrap the whole thing, and will have to make a new aggregation pipeline for it
- deprecated, you cant search unwound documents, have to just do a normal search and return the results of that
- highkey the update functions are very messy, i shouldve probably made a chart
- next comes stocks and all that but have to watch the quant stuff first ahahhahha me likey or just make a separate bot so i dont have to redeploy this thank you very much
- maybe add price changes in general? or have a toggle that enables it? for sensitive items that i want
- add cents to price value?
- different websites handle it differently:
- newegg has cents in a whole new element
- amazon does too
- have two html queries? one for cents, one for the dollar amount
- would have to change the graph and database to handle decimals
- kind of a pain in the ass highkey
- Go markdown generator for table view
- web crawler with arguments: gets link and the value of the HTML box it wants to track; if it doesnt find the thing it sends an error through the discord bot; web crawler database operations for price tracking and aggregation, second index etc
- maybe change so that the uri list has ids for quality of life
Done
Complete
- add additional name info to get query
- make it so that it does the ebay stuff for the queries separately?
- refactor all the random maps into an item
- make a good deal Algo?
- [ ] Color Code new listings based on their price relative to the aggregate reports
- channel length is out of sync, check for negative length in update channel length
- add multi search word functionality to items - basically the sub item thing i wanted to do
- discord command for add
- discord command for remove
- DB command for Add
- DB command for Remove
- pass in to secondhand handler
- Pass in to titleCorrectness Check
- change amazon html tag
- will have to add support for channel property change, just override the object and change its properties?
- channel will be weird, i think it might double up or sth, check tmrw
- bug: two pointers weird with update price alert
- get details function, triggers the errorcrawl alert and sends the requested files? 3 options: general, ebay, facebook
- the requests are kind of close to each other, set it to like random 10 min or sth
- add lowest price change to the slog of scheduler
- add err to chromedp
- edit timer still messy
- the second hand price wont take effect unless i also add it as a condition for the status change
- put a manual set price → removes all trackers, sets up the last available price with the new requested price
- bug: listing history when array is null for aggregation pipeline, also recheck the update price history logic
- also add price increase notifications for normal trackers
- hero picture fallback for amazon?
- refactor stealth actions to be more readable
- add get channel info
- add html to facebook and ebay failovers
- also add proxy pictures for default and facebook and ebay?
- bug: edit_tracker crashed last time, check if its still not working; not working because of values, will have to switch to indexes?
- add tracking list change support
- change handle ebay listing name
- found best buy change all the trackers
- might have to change facebook error
- bug: amazon used to work but now it doesnt
- on next deployment, update channel map when updating setup
- change the name of the setup function to be both update and create? or make it into two functions
- add html return to chromedp
- add proxy to chromedp
- Screenshot for errors is still not reliable
- change the avg price and aggregate price of the when-sold one, maybe its fine actually idk
- recheck duration logic
- add does accept offers?
- normal ebay crawl
- chromedp crawl
- response
- types.ebaylisting
- new listing old format
- ebay proxy failover options
- make it so that price change only fires if the difference is more than 5 bucks?
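The "$5 difference" gate above could be a tiny helper in front of the alert path; a sketch with hypothetical names, since the real scheduler wiring isn't shown here:

```go
package main

import (
	"fmt"
	"math"
)

// significantChange reports whether a price moved enough to justify an
// alert. threshold is in the same unit as the prices (dollars here).
// Hypothetical helper name; the real check would sit wherever price
// updates are compared in the scheduler.
func significantChange(oldPrice, newPrice, threshold float64) bool {
	return math.Abs(newPrice-oldPrice) >= threshold
}

func main() {
	fmt.Println(significantChange(199.99, 197.50, 5)) // small dip, no alert
	fmt.Println(significantChange(199.99, 189.99, 5)) // $10 drop, alert
}
```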
- invalid timer check
- avg price when sold: change it so that it doesnt include items that are still available
- add proton vpn for IP rotation https://hub.docker.com/r/genericmale/protonvpn https://hub.docker.com/r/qmcgaw/gluetun
- Total price change field
- fix autocomplete for timer in add
- formatter for second hand, make sure it doesnt give duplicates
- Best buy query is faulty
- add number of price reductions and price increases to second hand listing information? pretty useful for leverage determination when making offers
- Maybe have a dummy message that just sends the acknowledged message as an embed?
- updated listing logic testing, add 25 limit to add, per channel update for add, update for remove
- Bug: scheduler logic doesnt update when suppress changes
- change logic for ebay listings?
- all listings for the item on the same date
- put them in a map
- for loop, if the url is in the map, delete from the slice
- update
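The map-then-filter dedup described above could look like this in Go; `EbayListing` is a hypothetical stand-in for the project's `types.ebaylisting`:

```go
package main

import "fmt"

// EbayListing is a hypothetical stand-in for the project's listing type.
type EbayListing struct {
	URL   string
	Price float64
}

// dedupeListings implements the steps above: put the already-stored URLs
// for the day in a map, then keep only the crawled listings whose URL is
// not in that map.
func dedupeListings(existing, crawled []EbayListing) []EbayListing {
	seen := make(map[string]struct{}, len(existing))
	for _, l := range existing {
		seen[l.URL] = struct{}{}
	}
	var fresh []EbayListing
	for _, l := range crawled {
		if _, ok := seen[l.URL]; !ok {
			fresh = append(fresh, l)
		}
	}
	return fresh
}

func main() {
	existing := []EbayListing{{URL: "ebay.com/itm/1", Price: 120}}
	crawled := []EbayListing{
		{URL: "ebay.com/itm/1", Price: 115},
		{URL: "ebay.com/itm/2", Price: 99},
	}
	fmt.Println(len(dedupeListings(existing, crawled))) // 1 (only itm/2 is new)
}
```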
- ebay failover
- maybe also add tax to the prices?
- add custom timer intervals
- Refactor scheduler to update scheduling functions every hour and delete or add goroutines that are out of date
- add best buy support to the image grabber
- convert logs to slog
- add logging module
- logs with Grafana and Loki
- Setup Grafana + Prometheus
- Setup Loki
- Setup Alloy
- marketplace price logic bug?
- add suppress noti flag?
- fix facebook marketplace location
- add read to ebay regex
- add stdev filtering into the second aggregate pipeline and add Read
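The stdev filtering above lives in a Mongo aggregate pipeline in the project; as a plain-Go sketch of the same idea (assumed pipeline shape, not the real stages), filtering prices more than k standard deviations from the mean looks like:

```go
package main

import (
	"fmt"
	"math"
)

// filterOutliers drops prices more than k standard deviations from the
// mean, mirroring what the aggregate pipeline's stdev stage would do.
func filterOutliers(prices []float64, k float64) []float64 {
	if len(prices) == 0 {
		return nil
	}
	var sum float64
	for _, p := range prices {
		sum += p
	}
	mean := sum / float64(len(prices))
	var ss float64
	for _, p := range prices {
		ss += (p - mean) * (p - mean)
	}
	stdev := math.Sqrt(ss / float64(len(prices)))
	var kept []float64
	for _, p := range prices {
		if math.Abs(p-mean) <= k*stdev {
			kept = append(kept, p)
		}
	}
	return kept
}

func main() {
	// With one extreme scam listing, k=1 keeps the clustered prices.
	fmt.Println(filterOutliers([]float64{100, 105, 98, 102, 9999}, 1))
}
```

Note that with a single huge outlier in a small sample the outlier inflates the stdev itself, so a loose k (like 2) can fail to remove it; tuning k against real listing data is the judgment call here.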
- add compare method? render chart and send picture like graph function → compare aggregates
- compare graphs has a weird bug: some dates are at the start even though they are chronologically at the end
- fixed color code error
- error messages get mixed up, and it sends a chart? for some reason
- OLAP for used items
- testing
- refactor database into aggregate file
- can write a compare time frames aggregate function
- Info I want to add to the Item view
- [ ] 7 day aggregate
- Used Item Stats:
- Information I want to extract in the pipeline:
- how long historical listings lasted
- their price at the time they sell
- filter abnormal ones
- stdev
- for the get: if everything returns but webhook errs out, return a normal message afterwards
- Add aggregate used data to graphs price history
- might have to redo the scheduler get price logic but it would def be worth it
- maybe add kids and junior as exclude words
- test
- depop support
- Add item types
- on add
- have a check in second hand, if its clothes add depop crawler to the mix
- depop crawler
- refactor ebayHandler
- Bot Management:
- setup:
- take location and marketplace miles as input
- update DB
- load miles and location from DB on init
- Channel Delete:
- delete channel table on delete from DB
- list empty
- sort alphabetically in autocorrect return
- maybe also add a message length calculator and the field thing to make the spread more reliable
- add price sort when putting the stuff in
- pass in the last days titles to not needlessly get the link of already existing items, i.e. move dedup logic into the ebay submodule
- i dont need a dedupe crawl, its already there ahahahah
- figure out Meshify 3 Newegg and chromedp thing, it has an add-to-cart-to-see-price which is very stupid
- it shouldve filtered the bid on asrock, check if its not gone, will have to look at the logs or sth
- bug: check the pricing, it turns to zero somewhere somehow; yup, still taking everything as a price change for some reason
- add date to current lowest in embed
- style price update and new price found using the embed thing?
- [x] calculate size as building embed
- [x] if number of fields or max size
- [x] split into multiple embeds
- [x] return list of embeds
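The embed-splitting checklist above can be sketched as one pass that tracks size while building; the 25-field and ~6000-character caps are Discord's documented embed limits, everything else (type names, row shape) is hypothetical:

```go
package main

import "fmt"

// Field is a minimal stand-in for a discordgo.MessageEmbedField.
type Field struct{ Name, Value string }

// Embed groups fields; Discord caps embeds at 25 fields and 6000
// characters total, the limits assumed below.
type Embed struct{ Fields []Field }

const (
	maxFields = 25
	maxChars  = 6000
)

// splitIntoEmbeds builds embeds field by field, starting a new embed
// whenever the field count or running character size would overflow,
// and returns the list of embeds to send.
func splitIntoEmbeds(fields []Field) []Embed {
	var embeds []Embed
	cur := Embed{}
	size := 0
	for _, f := range fields {
		fSize := len(f.Name) + len(f.Value)
		if len(cur.Fields) >= maxFields || size+fSize > maxChars {
			embeds = append(embeds, cur)
			cur, size = Embed{}, 0
		}
		cur.Fields = append(cur.Fields, f)
		size += fSize
	}
	if len(cur.Fields) > 0 {
		embeds = append(embeds, cur)
	}
	return embeds
}

func main() {
	fields := make([]Field, 30)
	for i := range fields {
		fields[i] = Field{Name: "item", Value: "price"}
	}
	fmt.Println(len(splitIntoEmbeds(fields))) // 30 fields → 2 embeds
}
```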
- 30 item list, once list too big
- add pagination
- add concise list method
- messed up autocomplete
- add multi channel support for separation of concerns? → might wanna add cars, other types of tech, people etc
- would also need to add the indexes in the make channel collection section
- theres a 3 index limit per cluster for free atlas, either will have to self-host it
- not pushing new channel ids
- Multi Channel Support:
- new DB table to keep track of ID and table name
- on init load tables into memory
- make func for if a channel is new → create new table in DB
- change database functions to take in the name of the channel
- load it from the memory bit
- Change Scheduler for running the schedule on a table basis → get all channels first and then do a schedule for each
- ok channel id is interaction based…
- maybe make it so that it sends the screenshot of the failover to the discord
- check if err propagates properly / notify discord
- change content for facebook marketplace to also include formatted time and distance
- add geo api to doppler
- for now just make it work, later on i will add the distance api stuff
- facebook marketplace:
- url generator
- price extractor
- URL extractor? depends whether it has the listing url by default
- add edit name
- ebay read err propagate
- lol i crawled it so many times back to back it got throttled, lets hope i dont have to run it through aws lmfao
- refactor discord module into hooks, messages, formatting
- add support for ebay used items
- url way too long, maybe crawl individual pages and see if they have a og:link or sth like that
- regex a bit messy, discord messages with the urls are too long?
- the names cant be shit now since there is regex based on them - one-word denominators like 3 and x are very important with spaces around them, this kind of just rawdoggs them
- Ebay Reqs: Discord:
- discord new listing alert
- discord listing price update
- Add embed for ebay listings
- Ebay Reqs: Scheduler:
- get old and new listings
- compare listings
- if listing gone do nothing since whole array is updated
- if new listing found not in the previous crawl ping db
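The scheduler compare step above (ignore vanished listings, flag new ones, and catch price moves for the price-update alert) could be sketched like this; `Listing` and the function name are hypothetical:

```go
package main

import "fmt"

// Listing is a hypothetical stand-in for the crawled listing type.
type Listing struct {
	URL   string
	Price float64
}

// diffListings mirrors the compare step above: listings that vanished are
// ignored (the whole array gets replaced anyway), listings absent from
// the previous crawl come back as added, and listings whose price moved
// come back as updated.
func diffListings(previous, current []Listing) (added, updated []Listing) {
	prev := make(map[string]float64, len(previous))
	for _, l := range previous {
		prev[l.URL] = l.Price
	}
	for _, l := range current {
		old, seen := prev[l.URL]
		switch {
		case !seen:
			added = append(added, l)
		case old != l.Price:
			updated = append(updated, l)
		}
	}
	return added, updated
}

func main() {
	prev := []Listing{{"ebay.com/itm/1", 120}, {"ebay.com/itm/2", 80}}
	cur := []Listing{{"ebay.com/itm/2", 70}, {"ebay.com/itm/3", 60}}
	a, u := diffListings(prev, cur)
	fmt.Println(len(a), len(u)) // 1 new listing, 1 price update
}
```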
- Ebay Reqs: DB:
- Get EbayListings method
- Save EbayListings method
- Get listings on add
- Ebay Reqs: Crawler:
- returns listings
- regex verification
- llm verification
- test amazon image
- handle amazon link for automatic embeds?
- change remove message?
- test message embed and query selector
- add picture embed for all items
- get open graph (og) img url of the first link inserted
- save in DB
- return as a field for embed
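Pulling the `og:image` URL out of a fetched page, as the picture-embed steps above describe, can be done with a tolerant regex rather than a full HTML parser; a sketch that assumes the `property` attribute comes before `content` (the common order):

```go
package main

import (
	"fmt"
	"regexp"
)

// ogImageRe matches <meta property="og:image" content="..."> (or the
// name= variant), assuming property comes before content in the tag.
var ogImageRe = regexp.MustCompile(
	`<meta[^>]+(?:property|name)=["']og:image["'][^>]+content=["']([^"']+)["']`)

// extractOGImage pulls the Open Graph image URL out of a page's HTML so
// it can be saved in the DB and returned as an embed field. An empty
// string means no og:image was found.
func extractOGImage(html string) string {
	if m := ogImageRe.FindStringSubmatch(html); m != nil {
		return m[1]
	}
	return ""
}

func main() {
	page := `<head><meta property="og:image" content="https://example.com/hero.jpg"></head>`
	fmt.Println(extractOGImage(page))
}
```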
- database logic too big, split autocomplete stuff into its own file
- make the edit function more readable
- take channel id out of hard code
- nvm autocomplete for remove cant even be done, it doesnt support urls more than 100 chars
- delete the autocomplete index for the uri
- add one for query selector too? this one can honestly just be a hardcoded json? add a map for autocomplete with amazon, newegg, microcenter name/value pairs
- add autocomplete for all name fields
- mongodb access is based on ip
- doesnt support empty fuzzy search, handle it and return all instead?
- add autocomplete
- autocomplete command palette
- mongodb text search index plus fuzzy finder
- Chart: make background white, legend is unreadable in discord, plus legend padding and break new line, when url too long it pushes legend icons out of bounds
- title too big
- title not centered
- legend padding overflowing
- overlapping series, look if any options to make it readable?
- bug: graph error handling: if it cant find it, it will send the previously generated graph
- regex for graph query
- bug: scheduled crawling isnt happening for some reason
- commands are too long, shorten names?
- move cancel context channel logic to main
- if it thinks its zero, change it or skip update price?
- bug: its logging each crawl like 5 times
- maybe have it gather prices more frequently (twice daily) but also schedule a weekly price compression that also runs every time i redeploy
- maybe for name matching add fuzzy or lower case stuff?
- remove the hiiii price tracker command somehow
- add other cases i wanna keep track of
- amazon links behave weird, they arent fully rendered on send
- when adding trackers or removing them it returns the old version of the document
- formatting bug
- padding is kind of ugly, maybe have a total length and a helper function that pads the left and right automatically
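The padding helper suggested above (total width in, left/right padding computed automatically) could be as small as this; the name is hypothetical:

```go
package main

import (
	"fmt"
	"strings"
)

// padCenter pads s with spaces on both sides to an exact total width,
// putting the extra space (for odd remainders) on the right. Strings
// already at or over the width pass through unchanged.
func padCenter(s string, width int) string {
	if len(s) >= width {
		return s
	}
	left := (width - len(s)) / 2
	right := width - len(s) - left
	return strings.Repeat(" ", left) + s + strings.Repeat(" ", right)
}

func main() {
	fmt.Printf("[%s]\n", padCenter("$499.99", 12))
}
```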
- formatting with dollar sign front end and rounding up?
- it thinks current lowest price is 0
- current price, historical lowest price
- a lot of them have the dollar sign at the front, so maybe filter for that in the crawler .string function so i dont have to do much; microcenter and best buy will be supported by this
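The dollar-sign filtering above amounts to normalizing the scraped string before conversion; a sketch that also strips thousands separators so strings like "$1,299.99" parse without per-site code:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parsePrice normalizes a scraped price string: trim whitespace, strip a
// leading dollar sign and thousands commas, then convert. This is the
// filter the note above suggests for the crawler's .string handling.
func parsePrice(raw string) (float64, error) {
	s := strings.TrimSpace(raw)
	s = strings.TrimPrefix(s, "$")
	s = strings.ReplaceAll(s, ",", "")
	return strconv.ParseFloat(s, 64)
}

func main() {
	p, err := parsePrice(" $1,299.99 ")
	fmt.Println(p, err) // 1299.99 <nil>
}
```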
- also update current lowest price on price validation
- methods that have to call the crawl to validate the price will take longer, make it send an ack first and then edit, like the chart was doing
- Right now add price tracking info does return the price of the newly added page, but the Item page from mongo; should i change it to add that? its fine for add, but i would have to change edit tracker to also return a price object
- [x] add current lowest price in the document, update it in the crawler by setting to max int in the tracking array
- [x] add it as a response back for get item
- check error handling? missing fields in insert and edit
- valid uri check for uri field
- look if go has a native uri type
- if a new uri is added make sure that the crawler can access it
- return current price as first lowest price when adding
- circular dependencies, might have to relearn interfaces
- Embeds for all methods that return items?
- bug: headless docker version doesnt work for some reason
- use embeds to format get all fields, maybe also for get single item field? if it looks good
- uncomment crawler when done debugging
- charts dont work with the binary version on pi?
- github actions for pi deployment
- Price Chart
- Bot:
- get price graph for the last n months
- acknowledgement of request
- send the generated chart after its done
- it has to read the image from io and then send it and delete it
- graphing module
- graceful shutdown
- figure out the axis, if it needs an array or objects is fine?
- update lowest price might not be working
- DB:
- get full price history for the last n months
- put the context in main and let all the subroutines inherit it from there
- Test everything lmfao
- write logs for all functions
- Crawler:
- crawl method with link and html element
- update DB if found
- if not, update with error through discord
- call get all projects
- iterate through projects and call init project tracking, takes project name and tracker list, do this on a once a day timeout
- init project goes through and does random one hour timeouts to sparse out the requests and crawls the individual lists
- no lowest price default value, handle empty error
- Web Crawler Implementation: Database:
- get tracking list method for a project
- add new price to the DB
- if the lowest price, notify via bot function
- main:
- call init crawler
- alright i need there to be data so that i can do a pipeline on them
- it has a field called projection for specifying what to return, dont return the date and price for the get stuff
- change query for discord bot get queries to not return price history
- only return price tracking html list for crawler get method
- add price,date object into the overall object structure
- error handling for missing or non-returning values for DB functions, rn it just panics
- set up the nosql database for it to store and query from? does it have to be relational actually? maybe it does, look up mongoDB for more
- basic db functionality is up
- make the command function structure
- make the command structure
- get the options part of the command working
- get the slash command line to work
- get the client to connect
IP
- add and setup cadvisor
- add verbose per item for sending pictures of failovers
- add first-time database setup?
- make aggregate tables for all items in the channel https://github.com/jedib0t/go-pretty/tree/v6.7.8/table
Planned
- get logs is useless since it doesnt record the data unless it fails
- it would be waay too many reads and writes to the DB, so that wouldnt really be any good
- maybe add it to an S3 bucket or sth but thats just waay too overly complicated
- if i do do this i can send in the error images and html files as S3 urls
- maybe ill do this once i get a mac mini that has an actual ssd in it
- round robin multiple proxies?
https://brightdata.com/pricing/proxy-network/residential-proxies
its $4 per GB; either block image rendering on normal requests to prevent extra network consumption and make this feasible, bc rn it does like a GB of data per day
- add container statistics to collect how much network I/O the container uses
- [x] buy second pi, 4gb 60$ new
- [x] wait for used
- [x] buy switch with POE
- [x] POE cables
- [x] SSD? Not rn
- [x] SD card
- [x] move pis into POE
- [x] print enclosure for new PI + locks
- [ ] mongoDump to rp1
- [ ] mongoDump to rp2
- [ ] change driver to use replication
- [ ] nvm cant move mongo to rp4, it doesnt support the architecture, have to get a mini pc or 2 rp5s
- [ ] once this migration is done also add selfhosted runners, compile times are ridiculous on github actions
- bug: best buy also sometimes behaves very weird
- K8s for centralized logs deployment
- i think were done for now
- it is approaching a MB now which isnt much at all
- handler to reverse engineer marketplace tags if brittle
- Added to new Server → send init message
- Formatted aggregate tables for comparison instead of returning separately?
Archive
- bug: autocorrect for html for edit/add tracker