Failed to decode WakaTime dump issue #542
Comments
Hi @uzza1hossain!
You can't. Importing data is only supported via the built-in import mechanism. Which Wakapi instance did you use? A self-hosted one or Wakapi.dev? If you used a self-hosted instance, do you see any additional hints about why the import failed? If it was on Wakapi.dev, can you please send me your username, so I can do some digging?
I am using a self-hosted instance.
The Wakapi instance downloads the file from WakaTime, then cleans the data and uploads it to the DB. I assume that's the general workflow. Is there any way I can provide the file manually inside the Docker container? I mean, the main issue here is downloading the file. Thanks for your prompt reply.
What sort of errors did you get?
Currently, there isn't. The import process also involves a couple of requests against the WakaTime API (e.g. to resolve machine UUIDs to actual names), so it's not as simple as just uploading the JSON manually. But, of course, we could still add this as a feature. I'm a bit reluctant, though, because it would require some implementation effort and increase code complexity, only for a rather "niche" use case. It wouldn't even be an actual feature, but rather only a way to work around some other feature not working as intended. So I'd rather fix the current import mechanism. If we can learn more about why it's failing, we can perhaps account for that (e.g. increase timeouts or something).
I will try again once the 24-hour limit is over and will provide the full log. Thanks a lot.
On a self-hosted instance, you can also just skip the time limit by adapting
[ERROR] failed to decode data dump for user 'username' ('https://wakatime.s3.amazonaws.com/coding-activity-exports/long-api-key-from-wakatime/wakatime-emailgmail.com-longapikey.json?AWSAccessKeyId=exampleaccesskeyid&Signature=examplesignature&Expires=1699288339') - context deadline exceeded (Client.Timeout or context cancellation while reading body)
I saw that error before and have no idea why it's occurring. Even a file as big as many hundreds of megabytes shouldn't take 10 minutes to download and parse (unless you have a really slow connection?). How big is your particular file?
It's around 180 MB. My connection is fine for all other services, but I can't download this particular file. As I said before, it took about ten tries to download it manually; I was only able to get the file using aria2. Every time I try to download it from Chrome, it fails after a while.
Can you describe what that "failing" looks like? Do you get an error message? Do you see an infinitely spinning loading indicator?
The download starts at normal speed, then the speed keeps decreasing. After a while it gets stuck at "Resuming download".
Would you send me the download URL of your dump via e-mail (see my profile page), so I can try to do some debugging with it? No worries if not; I totally understand that coding activity is somewhat personal information.
I tried to import via Settings > Integrations > Import Data, but it failed (download failed).
I downloaded the JSON file after many tries (S3 is very slow in my case).
I can't import again because it says I can only do it after 24 hours.
Which API endpoint should I use to upload the data from my downloaded JSON file? (I can see there are many endpoints for heartbeats.)
Also, the data format in the downloaded JSON file and the format the API expects aren't the same. How do I transform the data so that the API can accept it?
I assume you also clean the downloaded data before adding it to the DB.
Thanks for the help.
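As a rough illustration of the transformation being asked about, here is a hedged Python sketch that flattens a dump's per-day heartbeat lists into individual heartbeat payloads. The dump structure (a top-level `days` list, each day carrying a `heartbeats` list) and all field names are assumptions for illustration, not confirmed against the actual WakaTime export or the Wakapi API schema:

```python
# Hypothetical sample mimicking an assumed WakaTime dump layout; field names
# are NOT verified against the real export format.
SAMPLE_DUMP = {
    "days": [
        {
            "date": "2023-11-01",
            "heartbeats": [
                {
                    "entity": "main.go",
                    "type": "file",
                    "time": 1698837600.0,
                    "project": "wakapi",
                    "language": "Go",
                    "is_write": True,
                },
            ],
        },
    ],
}


def extract_heartbeats(dump: dict) -> list[dict]:
    """Flatten per-day heartbeat lists into one list of upload payloads."""
    payloads = []
    for day in dump.get("days", []):
        for hb in day.get("heartbeats", []):
            # Keep only fields a heartbeat endpoint would plausibly accept;
            # the exact accepted schema must be checked against the API docs.
            payloads.append({
                "entity": hb.get("entity"),
                "type": hb.get("type", "file"),
                "time": hb.get("time"),
                "project": hb.get("project"),
                "language": hb.get("language"),
                "is_write": hb.get("is_write", False),
            })
    return payloads


print(len(extract_heartbeats(SAMPLE_DUMP)))  # number of flattened heartbeats
```

As the maintainer notes above, a manual upload like this would skip the extra WakaTime API lookups (e.g. machine UUID resolution) that the built-in import performs, so it is a workaround sketch rather than an equivalent of the real import.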