gpt-trainer.com already has this. They also have a lot more features, and their team is a lot more responsive. I highly recommend checking them out.
Mark Pivon • 19 days ago
RE:Tune is RE:tired.
RIP. Thanks for taking my money and giving me abandonware.
Sam Carter • 4 months ago
Hi. Every website scrape I have done so far says it has failed. Why is this and can it be fixed?
Bram Molmans • 7 months ago
Periodically check the sitemap, or let us set up an API call that re-runs the sitemap scrape as soon as a new blog/page has been added.
pchlada • 7 months ago
It would also be great if the bot could check it periodically for new content to learn from...
Patrick OConnell • 7 months ago
I need the chatbot to be able to crawl a prospective client's website.
Mirza Iqbal • 7 months ago
Seems like the dev team has abandoned responding.
Jeremy Gray • 7 months ago
I just started, but I scraped my URLs from the sitemap, used GPT to strip them down to just the URLs, and then uploaded. Would be great to simply enter the sitemap URL and have it work!
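For anyone doing this by hand in the meantime: stripping a sitemap down to bare URLs doesn't need GPT. A short script can do it, a sketch assuming a standard sitemaps.org `sitemap.xml` (the fetch URL in the usage comment is hypothetical):

```python
import xml.etree.ElementTree as ET

# Standard namespace defined by the sitemaps.org protocol
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def sitemap_urls(sitemap_xml: str) -> list[str]:
    """Return every <loc> URL listed in a sitemap.xml document."""
    root = ET.fromstring(sitemap_xml)
    return [
        loc.text.strip()
        for loc in root.iter(f"{{{SITEMAP_NS}}}loc")
        if loc.text
    ]

# Example usage (hypothetical sitemap location):
# import urllib.request
# xml_text = urllib.request.urlopen("https://example.com/sitemap.xml").read().decode()
# for url in sitemap_urls(xml_text):
#     print(url)
```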
Jason Roach • 8 months ago
Also, in addition to the previously mentioned filtering (wildcard inclusion/exclusion), an option to dynamically exclude a page if it has noindex, unless it's on the inclusion list.
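The noindex check being requested could be as simple as scanning the page's robots meta tag. A naive sketch (it assumes the `name` attribute precedes `content`; a real implementation would use an HTML parser):

```python
import re

def has_noindex(html: str) -> bool:
    """Detect a <meta name="robots" content="...noindex..."> tag (naive regex check)."""
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
        html,
        re.IGNORECASE,
    )
    return bool(meta and "noindex" in meta.group(1).lower())
```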
jayson patrick • 8 months ago
I totally agree with the idea of updating the database whenever the page content changes.
Andre Lopes • 8 months ago
Any news on this?
Chris Tucker • 8 months ago
Definitely need to be able to update links when content is changed. This would drastically improve the workflow.
Cagri Sarigoz • 8 months ago
It would be great if the bot finds new URLs in the sitemap and continues to train itself with the new content. :)
SEO Master • 9 months ago
About scraping URLs: I need to click one by one if I want to scrape the whole site, otherwise it fails. It would be better to click once, scrape all URLs, and then delete what we don't need: easier and faster.
And if one fails, add the possibility to relaunch or delete it.
mybeat • 9 months ago
This would be huge! With basic filtering in / filtering out based on URLs. E.g. contains, doesn't contain, regex match
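The include/exclude logic described here could be quite small. A sketch with hypothetical parameter names, where each filter is optional:

```python
import re

def filter_urls(urls, contains=None, not_contains=None, pattern=None):
    """Keep URLs that contain `contains`, lack `not_contains`,
    and match the regex `pattern`. Any filter left as None is skipped."""
    kept = []
    for url in urls:
        if contains and contains not in url:
            continue
        if not_contains and not_contains in url:
            continue
        if pattern and not re.search(pattern, url):
            continue
        kept.append(url)
    return kept
```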
ia • 9 months ago
Also, being able to provide descriptions along with the sitemap, so they can be used to give the best-fitting URLs to people asking the bot.
me • 9 months ago
Basic filtering with the possibility to exclude pages, or better yet, wildcard inclusion and exclusion.
freddiechip5 • 9 months ago
This, along with showing how much data (or how many vectors) was extracted from each website. Sometimes I input URLs but am not sure whether the system was able to get any data.
lukasz lothammer • 9 months ago
Preferably sitemap with possibility to do some basic filtering in / filtering out based on URLs. E.g. contains, doesn't contain, regex match