
Export all error. I seem to be hitting a rate limit #88

Closed
bryanschmidty opened this issue Mar 24, 2023 · 13 comments · Fixed by #90
Labels
bug Something isn't working

Comments

@bryanschmidty
When I try to export all, many of the XHR requests fail. The response is

{
    "detail": "Too many requests in 1 minute. Try again later."
}
@EliahKagan
EliahKagan commented Mar 24, 2023

I get what I believe to be the same issue when attempting to export all my ChatGPT conversations (I have hundreds). Numerous 429 Too Many Requests errors occur, one per conversation, for most of the conversations.

From looking at the errors in my browser console, it seems that ChatGPT Exporter makes only one attempt to retrieve each conversation, and that it does not introduce any delay when it receives HTTP 429 errors. If I am correct in thinking this, then retrying with exponential backoff could solve the problem and facilitate exporting arbitrarily many conversations successfully. OpenAI suggests this approach when using their other APIs (though those code examples are Python-only).
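A minimal sketch of that idea, assuming the exporter exposes something like a fetchConversation(id) whose rejection carries the HTTP status (the names and error shape here are illustrative, not the exporter's actual API):

```javascript
// Promise-based delay helper.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Retry a fetch-like function with exponential backoff on HTTP 429.
// Assumes the rejected error exposes the status code as err.status.
async function fetchWithBackoff(fetchFn, id, maxRetries = 5, baseDelayMs = 1000) {
    for (let attempt = 0; attempt <= maxRetries; attempt++) {
        try {
            return await fetchFn(id);
        } catch (err) {
            // Only back off on rate limiting; rethrow anything else,
            // and give up once the retry budget is exhausted.
            if (err.status !== 429 || attempt === maxRetries) throw err;
            // Exponential backoff: base, 2x, 4x, 8x, ... plus jitter
            // of up to one base delay to avoid synchronized retries.
            const delay = baseDelayMs * 2 ** attempt + Math.random() * baseDelayMs;
            await sleep(delay);
        }
    }
}
```

The export loop could then call fetchWithBackoff(fetchConversation, id) instead of fetchConversation(id) directly.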

@p1nkl0bst3r
p1nkl0bst3r commented Mar 24, 2023

It tries to retrieve all the conversations asynchronously at the same time.
I just changed it to grab the conversations sequentially.
I did this for the JSON export; for the others you only need to push conversation.

const conversations = [];
let counter = 0;
for (const id of conversationIds) {
    const rawConversation = await fetchConversation(id);
    const conversation = processConversation(rawConversation);
    await sleep(200);
    counter++;
    console.log(counter);
    conversations.push({
        conversation,
        rawConversation
    });
}
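The loop above calls a sleep helper that isn't a browser built-in; if the userscript doesn't already define one, a promise-based version along these lines (name assumed) works:

```javascript
// Resolve after ms milliseconds, so `await sleep(ms)` pauses an async function.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
```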

@bryanschmidty
Author

This works for me, but I have to increase the sleep value.

  • sleep(200) and sleep(400) get me to count 50 (not sure why I can't do more with 400).
  • sleep(1000) gets me to 85.
  • sleep(1200) - count 110
  • sleep(1500) - finally got all 145 conversations.

But then I encounter a new error.

Uncaught (in promise) TypeError: Cannot read properties of undefined (reading 'map')
    at conversationToHtml (userscript.html?name=ChatGPT%2520Exporter.user.js&id=e9bbbb4c-28c6-40e2-9827-f8279c573986:15540:48)
    at userscript.html?name=ChatGPT%2520Exporter.user.js&id=e9bbbb4c-28c6-40e2-9827-f8279c573986:15525:24
    at Array.forEach (<anonymous>)
    at exportAllToHtml (userscript.html?name=ChatGPT%2520Exporter.user.js&id=e9bbbb4c-28c6-40e2-9827-f8279c573986:15521:19)

code at line 15540:

const conversationHtml = conversationNodes.map((item) => {

@bryanschmidty
Author

bryanschmidty commented Mar 24, 2023

I didn't see this until now: "I did this for json export, for the others you only need to push conversation."

I'm exporting to HTML

@p1nkl0bst3r
yeah just change it from:

conversations.push({
            conversation,
            rawConversation
        });

to

conversations.push(conversation);

for Markdown and HTML.

The sleep value required probably depends on how many conversations you request within a minute.
Retrying with exponential backoff would be best here, but I'll leave that for the dev to implement.

@bryanschmidty
Author

Yeah, it's working now. Thanks so much. I think a progress bar would be nice too. We're okay seeing it in the console logs, but a UI for it would be better.

@p1nkl0bst3r
I fully agree, this is just a stopgap solution I quickly made to get it working for myself

@pionxzh pionxzh added the bug Something isn't working label Mar 25, 2023
@pionxzh
Owner

pionxzh commented Mar 25, 2023

Thanks for all the information provided. I will check OpenAI's documentation and improve the rate-limit handling.

@pionxzh

pionxzh commented Mar 25, 2023

> Yeah, it's working now. Thanks so much. I think a progress bar would be nice too. We're okay seeing it in the console logs, but a UI for it would be better.

And yes, maybe a progress bar. The reason it hasn't been implemented is... it's just too fast when I do small-batch testing 😆

@pionxzh

pionxzh commented Mar 25, 2023

It will look like this:
[screenshot: progress bar shown during export]

@kerbymart

I am not able to export all due to rate limiting. Looking at the network calls, it seems it tries to download everything in parallel, so I modified the code a bit to introduce an artificial delay, and I was able to download all 326 of my conversations:

async function exportAllToMarkdown(fileNameFormat, conversationIds) {
    const conversations = [];
    const delay = 1500;
    for (const id of conversationIds) {
        const rawConversation = await fetchConversation(id);
        const conversation = processConversation(rawConversation);
        conversations.push(conversation);
        await new Promise((resolve) => setTimeout(resolve, delay));
    }
    const zip = new JSZip2();
    conversations.forEach((conversation) => {
        const fileName = getFileNameWithFormat(fileNameFormat, "md", {
            title: conversation.title
        });
        const content2 = conversationToMarkdown(conversation);
        zip.file(fileName, content2);
    });
    const blob = await zip.generateAsync({
        type: "blob"
    });
    downloadFile("chatgpt-export.zip", "application/zip", blob);
    return true;
}

Well, it's not fast: at a 1500 ms delay, 326 conversations spend roughly eight minutes just sleeping.

@pionxzh
Owner

pionxzh commented Mar 25, 2023

> I am not able to export all due to rate limiting. Looking at the network calls, it seems it tries to download everything in parallel, so I modified the code a bit to introduce an artificial delay, and I was able to download all 326 of my conversations:

1500 ms works for you to download all of them?
I'm still looking for a reasonable delay :<

@bryanschmidty
Author

Thank you so much. This is a game changer! :) :) :)
