    dfonseca • 9 months ago
    dead project
    Wanderley Patrick • 1 year ago
    Is Retune still alive?
    Jeremy Chambers • 2 years ago
    gpt-trainer.com already has this. They also have a lot more features, and their team is a lot more responsive. I highly recommend checking them out.
    Grande Fromage • 2 years ago
    RE:Tune is RE:tired. RIP. Thanks for taking my money and giving me abandonware.
    Sam Carter • 2 years ago
    Hi. Every website scrape I have done so far says it has failed. Why is this, and can it be fixed?
    Bram Molmans • 2 years ago
    Periodically check the sitemap, or set up an API we can call so the sitemap is re-scraped as soon as a new blog/page has been added.
    pchlada • 2 years ago
    Please also make it so the bot can check the site periodically for new content to learn from...
    Patrick OConnell • 2 years ago
    I need the chatbot to be able to crawl a prospective client's website.
    Mirza Iqbal • 2 years ago
    Seems like the dev team is abandoning the responses.
    Jeremy Gray • 2 years ago
    I just started, but I scraped my URLs from the sitemap, used GPT to strip the output down to just the URLs, and then uploaded them. It would be great to simply enter the sitemap URL and have it work!
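    For anyone stuck at that step, a minimal Python sketch (the sitemap URL is a placeholder) can pull the page URLs straight out of a standard XML sitemap, no GPT needed:

        import urllib.request
        import xml.etree.ElementTree as ET

        SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder
        NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

        # Fetch and parse the sitemap, then collect every <loc> entry.
        with urllib.request.urlopen(SITEMAP_URL) as resp:
            root = ET.fromstring(resp.read())
        urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]
        print("\n".join(urls))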
    Jason Roach • 2 years ago
    Also, in addition to the earlier mention of filtering (wildcard inclusion/exclusion), add an option to dynamically exclude a page if it has noindex, unless it is on the inclusion list.
    jayson patrick • 2 years ago
    I totally agree with the idea of updating the database whenever the page content changes.
    Andre Lopes • 2 years ago
    Any news on this?
    Chris Tucker • 2 years ago
    Definitely need to be able to update links when content is changed. This would drastically improve the workflow.
    Cagri Sarigoz • 2 years ago
    It would be great if the bot found new URLs in the sitemap and continued to train itself on the new content. :)
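    Something like this would cover the periodic check; a small Python sketch (the sitemap URL and state file are placeholders) that diffs the sitemap against the URLs seen on the last run and flags anything new:

        import pathlib
        import urllib.request
        import xml.etree.ElementTree as ET

        SITEMAP_URL = "https://example.com/sitemap.xml"   # placeholder
        SEEN_FILE = pathlib.Path("seen_urls.txt")         # placeholder state file
        NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

        # Fetch the current sitemap and collect its URLs.
        with urllib.request.urlopen(SITEMAP_URL) as resp:
            root = ET.fromstring(resp.read())
        current = {loc.text.strip() for loc in root.findall(".//sm:loc", NS)}

        # Compare against last run; new URLs are the ones to (re)train on.
        seen = set(SEEN_FILE.read_text().split()) if SEEN_FILE.exists() else set()
        for url in sorted(current - seen):
            print("new:", url)  # here you would queue the URL for training

        SEEN_FILE.write_text("\n".join(sorted(current)))

    Run it on a cron schedule and the bot's source list stays in sync with the sitemap.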
    SEO Master • 2 years ago
    ABOUT SCRAPING URLs: I need to click them one by one if I want to scrape the whole site, otherwise it fails. It would be better to click once, scrape all the URLs, and then delete what we do not need, i.e. easier and faster. And if one fails, add the possibility to relaunch or delete it.
    mybeat • 2 years ago
    This would be huge! With basic filtering in/out based on URLs, e.g. contains, doesn't contain, regex match.
    ia • 2 years ago
    Also, being able to attach descriptions to the sitemap entries, so they can be used to serve the best-fitting URLs to people asking the bot.
    me • 2 years ago
    Basic filtering with the possibility to exclude pages, or better yet, wildcard inclusion and exclusion.
    freddiechip5 • 2 years ago
    This, along with showing how much data or how many vectors were extracted from each website. Sometimes I input URLs but am not sure whether the system got any data.
    lukasz lothammer • 2 years ago
    Preferably a sitemap with the possibility of some basic filtering in/out based on URLs, e.g. contains, doesn't contain, regex match.
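    A filter along these lines would probably be enough; a minimal Python sketch, where the rule lists and URLs are made up purely for illustration:

        import re

        INCLUDE_CONTAINS = ["/blog/"]           # keep only URLs containing any of these
        EXCLUDE_CONTAINS = ["/tag/", "?page="]  # drop URLs containing any of these
        EXCLUDE_REGEX = [re.compile(r"/draft-")]  # drop URLs matching any regex

        def keep(url: str) -> bool:
            # Inclusion list first, then substring and regex exclusions.
            if INCLUDE_CONTAINS and not any(s in url for s in INCLUDE_CONTAINS):
                return False
            if any(s in url for s in EXCLUDE_CONTAINS):
                return False
            if any(rx.search(url) for rx in EXCLUDE_REGEX):
                return False
            return True

        urls = [
            "https://example.com/blog/launch",
            "https://example.com/blog/tag/news",
            "https://example.com/about",
        ]
        print([u for u in urls if keep(u)])  # only /blog/launch survives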
    support • 2 years ago
    XML sitemap support would be huge!
    gregborn1 • 2 years ago
    This would be sooo helpful!
    Kenny • 2 years ago
    XML sitemap, please!