This repository has been archived by the owner on Dec 2, 2019. It is now read-only.

Many new API endpoints, queue for requests to reduce request limit errors, updated model interfaces #36

Closed
wants to merge 51 commits

Conversation

@JumpLink (Contributor) commented Aug 21, 2019

Hello, we have a larger update for you. We hope you accept this pull request so that we do not drift further away from the main branch.

Work

  • Implemented queue logic to reduce "Too Many Requests" (HTTP 429) errors
  • Updated many interfaces by removing optional parameters
  • Updated many interfaces by fixing property types and names
  • Instead of making all properties of a model interface optional, the Partial utility type is now used
  • Some models received an additional interface with the UpdateCreate suffix, which makes explicit which parameters are required and which sub-objects' properties must also be optional (in the long term, all models should get such UpdateCreate interfaces; see the sketch after this list)
  • Added some new Shopify API endpoints
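
To illustrate the interface pattern, here is a minimal sketch. The `Page` fields and the `PageUpdateCreate` shape below are only illustrative examples, not the actual model definitions in this PR.

```typescript
// Illustrative sketch only; field names are examples, not the real model.
export interface Page {
  id: number;         // always present on objects returned by the API
  title: string;
  body_html: string;
  handle: string;
}

// For create/update payloads, a separate interface spells out which
// properties are required; everything else stays optional.
export type PageUpdateCreate = Partial<Omit<Page, "id" | "title">> & {
  title: string;      // required when creating a page
};

// A read can still return a partial object (e.g. when a `fields` query
// parameter limits the response), hence Partial<Page> in such signatures.
function printTitle(page: Partial<Page>): void {
  if (page.title !== undefined) {
    console.log(page.title);
  }
}
```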

Breaking Changes

If you have imported the model interfaces into your project, they will need to be adapted, so we recommend a major version bump.

Note

Not all of the new endpoints have been tested, but they should provide a good basis. The Pages, Products, Orders, and Transaction endpoints, among others, are already in production use on our side and should be in a good state.

Moritz Raguschat and others added 30 commits December 10, 2018 18:56
@nozzlegear (Owner)

Wow, this is impressive! Thanks for the pull request, I'll need a couple days to go over all of the changes. Please feel free to ping me if I still haven't merged this in by Monday.

@trollkotze

The queue thing is quite dirty and hacky. I never intended that to be merged into upstream in that form.
But some way to keep an eye on the API call bucket limit would be essential.

Shopify sends a response header "x-shopify-shop-api-call-limit" with every API response, informing about the state of the leaky bucket associated with the access token for which the request was made.
Previously, Shopify Prime simply discarded that header and only returned the JSON payload on a successful request.
So I added a map of "CallLimits" structures as a static member of BaseService to keep track of the bucket states for all tokens. After each successful response, the "CallLimits" associated with the access token in question is updated.
If the bucket fill state is running low, requests are delayed and queued to give the bucket time to be replenished.
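
Roughly, the idea looks like this (a simplified sketch; the names, headroom, and delay below are placeholders, not the exact values used in the PR):

```typescript
interface CallLimits {
  current: number;  // calls currently counted against the leaky bucket
  max: number;      // bucket size reported by Shopify, e.g. 40
}

// One bucket state per access token, shared by all service instances.
const callLimits = new Map<string, CallLimits>();

// After every response, record the bucket state from the header.
function updateCallLimits(accessToken: string, response: Response): void {
  const header = response.headers.get("x-shopify-shop-api-call-limit");
  if (!header) {
    return;
  }
  const [current, max] = header.split("/").map(Number);
  callLimits.set(accessToken, { current, max });
}

// Before sending a request, wait briefly if the bucket is nearly full so
// it has time to drain (placeholder headroom and delay values).
async function delayIfNearLimit(accessToken: string): Promise<void> {
  const limits = callLimits.get(accessToken);
  if (!limits) {
    return;
  }
  if (limits.current >= limits.max - 10) {
    await new Promise<void>(resolve => setTimeout(resolve, 1000));
  }
}
```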

The mechanism is a bit hacky and maybe not optimal to keep hard-coded like that. The actual numbers in the PR are a bit arbitrary and "over-cautious", because I wanted to keep things as frictionless as possible when two app instances use the same access token at once (neither of them knowing when the other has made an API call).

Maybe it is best to keep that queuing-and-delaying mechanism out of Shopify Prime, or to make it optional, or at least more configurable. But knowing the call-limit info is quite important for making an app with potentially high request rates run smoothly.
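
For example, an opt-in configuration along these lines could work; these option names are purely hypothetical and not part of Shopify Prime's API:

```typescript
// Hypothetical option shape for making the throttle opt-in and tunable.
interface ThrottleOptions {
  enabled: boolean;      // default false: no queuing or delaying at all
  headroom: number;      // free bucket slots to keep in reserve
  retryDelayMs: number;  // how long to wait before retrying a delayed call
}

// Cautious settings for two app instances sharing one access token.
const throttle: ThrottleOptions = {
  enabled: true,
  headroom: 10,
  retryDelayMs: 1000,
};
```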

@JumpLink (Contributor, Author)

@nozzlegear If you want, we can make the queue optional and configurable (or remove it completely if you think that's right). What about the other changes, e.g. the interfaces that now use the Partial type?

@nozzlegear (Owner) commented Aug 28, 2019 via email

@trollkotze

I'm looking into it. Maybe I will get to it on Sunday.

@nozzlegear (Owner)

Thank you again for this pull request and all of the effort you put into it. Unfortunately I've decided to deprecate this package and repository due to time constraints on my own part. The code is MIT licensed, so please feel free to continue using/improving it!

@nozzlegear closed this Nov 27, 2019