Rate Limits
The Everee API uses rate limiting to protect its stability and reliability for all our customers. When clients send too many requests in a short period, they may receive an HTTP 429 (Too Many Requests) response, indicating that no capacity is available to handle further requests for a short time.
- The rate limiter allows up to 50 operations per 5-second window (read or write) by default.
- This limit may change over time to keep our systems protected from abuse.
- Some API endpoints may have special limits. If an endpoint has a special limit, the limit is included in the API endpoint's reference documentation.
Handling rate limits
To handle rate limiting smoothly, your application should watch for HTTP 429 responses from the Everee API and retry the operation when one occurs.
The simplest approach is to retry after a fixed short delay. But because rate limits apply to a time window, retrying a large number of failed operations after the same fixed delay can cause them all to collide again, leading to repeated failures, delays, or instability in your application.
A better method is to back off retries exponentially and add a small amount of randomness (jitter) to each delay. This "spreads out" failed requests over a short period, which maximizes the likelihood that they will succeed quickly.
There are well-used, mature libraries available in all major languages to help implement a backed-off retry approach when working with rate-limited APIs.
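As a minimal sketch of the approach described above, the snippet below computes an exponentially growing, jittered delay and retries on 429 responses. The base delay, cap, and attempt count are illustrative values, not Everee-recommended settings, and `do_request` stands in for whatever client call your application makes:

```python
import random
import time

def backoff_delay(attempt: int, base_delay: float = 0.5, max_delay: float = 30.0) -> float:
    """Delay (in seconds) before retry number `attempt` (0-based).

    Uses "full jitter": a random point in [0, base * 2^attempt], capped at
    max_delay, so many clients retrying at once spread their requests out.
    """
    exponential = min(max_delay, base_delay * (2 ** attempt))
    return random.uniform(0, exponential)

def call_with_retries(do_request, max_attempts: int = 5):
    """Call do_request() -> (status_code, body), retrying on HTTP 429."""
    for attempt in range(max_attempts):
        status, body = do_request()
        if status != 429:
            return status, body
        time.sleep(backoff_delay(attempt))
    # Out of attempts; surface the last response to the caller.
    return status, body
```

A mature retry library (for example, one supporting exponential backoff with jitter out of the box) is usually a better choice in production than hand-rolling this loop.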
Queuing requests & bulk endpoints
A general approach to scalable integrations is to use queue systems wherever possible. This helps ensure reliable eventual consistency between systems, and allows your system to gracefully handle the errors and limits that exist in real-world scenarios. It also lets your system interact cleanly with APIs that communicate rate-limiting statistics, like Everee's API (see "Rate limit metadata" below).
It's not practical to queue all requests, since some need to happen atomically as part of synchronous operations in your system. These are often part of specific user actions. That's completely fine, since user actions are often lower volume than synchronization or scheduled bulk operations. Generally, you should try to queue requests when you don't need them as part of an atomic user action.
Finally, use the endpoints labeled "Bulk" whenever you can. These endpoints let you work with much larger datasets than single-item endpoints, moving far more data while consuming far less of your rate-limit quota.
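The queue-plus-bulk idea above can be sketched as a worker that drains queued items and submits them in batches. `submit_bulk` and the batch size are assumptions standing in for your client's wrapper around a bulk endpoint; they are not part of the Everee API:

```python
from queue import Queue

BATCH_SIZE = 50  # illustrative; tune to the bulk endpoint's documented limits

def drain_in_batches(work_queue: Queue, submit_bulk) -> None:
    """Drain queued items and send them via a bulk endpoint.

    `submit_bulk` is a placeholder for whatever client call wraps the bulk
    API; it receives a list of items and makes one request for the batch,
    so 120 queued items cost 3 requests instead of 120.
    """
    batch = []
    while not work_queue.empty():
        batch.append(work_queue.get())
        if len(batch) == BATCH_SIZE:
            submit_bulk(batch)
            batch = []
    if batch:  # flush any partial final batch
        submit_bulk(batch)
```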
Rate limit metadata
The Everee API communicates the current rate-limiting status to users using the following HTTP response headers:
| Header name | Explanation |
|---|---|
| RateLimit-Remaining | Remaining operations allowed in the current time window. |
| RateLimit-Reset | Number of seconds until more operations become available. |
| RateLimit-Limit | Maximum number of operations allowed in each time window. |
| RateLimit-Policy | The rate-limiting policy in effect for this endpoint, as defined by the IETF RateLimit header fields draft. |
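A client can use these headers to wait proactively instead of retrying blindly. The helper below reads the header names from the table above; treating missing or malformed headers as "no limit information" is this sketch's assumption, not documented API behavior:

```python
def wait_time_from_headers(headers: dict) -> float:
    """Seconds to wait before the next request, based on RateLimit-* headers.

    Returns 0.0 when capacity remains in the current window, or when the
    headers are absent or unparseable (assumed to mean no limit info).
    """
    try:
        remaining = int(headers["RateLimit-Remaining"])
        reset = float(headers["RateLimit-Reset"])
    except (KeyError, ValueError):
        return 0.0
    # No operations left in this window: wait until it resets.
    return reset if remaining <= 0 else 0.0
```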