How Can I Speed Up My Requests?
Optimizing your requests can significantly improve performance and reduce response times. Below are general best practices, followed by specific recommendations for each ZenRows product.
Factors Affecting Request Speed
- Concurrency: Sending multiple requests simultaneously can increase throughput
- Resource Usage: Features like JavaScript rendering or waiting for specific conditions can impact speed
- Response Size: Large pages and dynamic content naturally take longer to load. Consider targeting only the necessary data or using output filters to minimize the payload
- Success Rate: Failed requests must be retried, so a low success rate means you may need more requests or higher concurrency to achieve the same effective throughput
Monitoring Concurrency Usage
Each response includes headers that help you manage and optimize your concurrency. Use them to:
- Monitor how many concurrent requests your plan allows
- Track how many slots are currently available
- Adjust request volume dynamically to avoid hitting limits that may delay or throttle requests
If you exceed your concurrency limit, requests fail with a `429 Too Many Requests` error and your IP can be temporarily blocked for 5 minutes. If you receive a `BLK0001` error (IP Address Blocked), it means your IP has exceeded the allowed error rate. The block lasts 5 minutes and prevents new requests during that time, affecting your overall scraping speed. For more details, see our API Error Codes documentation.

Use these headers to adjust your request flow in real time, scaling up when possible and backing off before hitting limits.
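As a rough sketch of this pattern, assuming Python with the `requests` package; the header names and the backoff threshold are illustrative assumptions, so inspect the headers in your own responses:

```python
import time

import requests

API_URL = "https://api.zenrows.com/v1/"
API_KEY = "YOUR_API_KEY"

def fetch(target_url: str) -> requests.Response:
    """Fetch one page and back off when free concurrency slots run low."""
    response = requests.get(API_URL, params={"apikey": API_KEY, "url": target_url})

    # Header names here are assumptions; check response.headers for the real ones
    limit = response.headers.get("Concurrency-Limit")
    remaining = response.headers.get("Concurrency-Remaining")
    print(f"concurrency slots free: {remaining}/{limit}")

    # Back off before hitting the limit to avoid 429 errors and IP blocks
    if remaining is not None and int(remaining) < 2:
        time.sleep(1)

    return response
```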
Product-Specific Recommendations
Universal Scraper API
- Optimize JavaScript Rendering:
  - Disable `js_render=true` for static content to improve speed
  - Only enable it when necessary for dynamic content or accessing protected content
  - Consider the impact on response times
- Minimize Wait Times:
  - Use `wait` and `wait_for` only when required
  - Set the minimum necessary wait duration
  - Longer waits mean slower requests
- Use Premium Proxies:
  - Enable `premium_proxy=true` for faster, more consistent responses
  - Particularly useful for sites with anti-bot measures
  - Can reduce retries and improve overall speed
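A minimal sketch of how these parameters might be combined, assuming Python with the `requests` package; the target URLs and the `.results` selector are hypothetical placeholders:

```python
import requests

API_URL = "https://api.zenrows.com/v1/"
API_KEY = "YOUR_API_KEY"

# Static page: skip rendering and waits entirely for the fastest response
fast = requests.get(API_URL, params={
    "apikey": API_KEY,
    "url": "https://example.com/static-page",
})

# Dynamic, protected page: enable only the features it actually needs
robust = requests.get(API_URL, params={
    "apikey": API_KEY,
    "url": "https://example.com/app",
    "js_render": "true",      # needed only for JavaScript-generated content
    "wait_for": ".results",   # minimal, targeted wait condition
    "premium_proxy": "true",  # fewer retries on sites with anti-bot measures
})
```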
Scraper APIs
- Concurrency Management:
  - Start with moderate concurrency and monitor response times
  - Increase gradually while maintaining acceptable speed
  - Implement backoff strategies when requests slow down (see the sketch after this list)
- Parameter Optimization:
  - Remove unnecessary parameters that might slow down requests
  - Only use parameters essential for your use case
  - Monitor the impact of each parameter on response times
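One way to ramp concurrency carefully, sketched with Python's standard `concurrent.futures` and the `requests` package; the worker count and target URLs are illustrative:

```python
import time
from concurrent.futures import ThreadPoolExecutor

import requests

API_URL = "https://api.zenrows.com/v1/"
API_KEY = "YOUR_API_KEY"
urls = [f"https://example.com/page/{i}" for i in range(20)]  # placeholder targets

def timed_fetch(target_url: str) -> float:
    """Return how long one request takes, so concurrency can be tuned."""
    start = time.monotonic()
    requests.get(API_URL, params={"apikey": API_KEY, "url": target_url})
    return time.monotonic() - start

# Start moderate; only raise max_workers while response times stay acceptable
with ThreadPoolExecutor(max_workers=5) as pool:
    durations = list(pool.map(timed_fetch, urls))

average = sum(durations) / len(durations)
print(f"average response time: {average:.2f}s")
# If the average climbs after increasing max_workers, back off to the previous level
```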
Residential Proxies
- Request Rate Optimization:
  - Monitor response times at different request rates
  - Adjust based on target site performance
  - Implement backoff when responses slow down (see the sketch below)
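A sketch of adaptive pacing through a residential proxy, assuming Python with the `requests` package; the proxy URL, thresholds, and delays are all hypothetical placeholders to tune for your targets:

```python
import time

import requests

# Placeholder endpoint: substitute your residential proxy credentials and host
PROXY = "http://USERNAME:PASSWORD@proxy.example.com:8080"
proxies = {"http": PROXY, "https": PROXY}

delay = 0.5  # seconds between requests; tune per target site

for url in ("https://example.com/page/1", "https://example.com/page/2"):
    start = time.monotonic()
    requests.get(url, proxies=proxies, timeout=30)
    elapsed = time.monotonic() - start

    # Slow down when the target slows down; speed back up once it recovers
    if elapsed > 5.0:
        delay = min(delay * 2, 10.0)
    else:
        delay = max(delay / 2, 0.5)
    time.sleep(delay)
```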
Scraping Browser
- Resource Management:
  - Disable unnecessary JavaScript execution
  - Block non-essential resources (images, media, etc.), as in the sketch after this list
  - Optimize browser settings for faster loading
- CAPTCHA Handling:
  - Implement manual CAPTCHA solving to avoid automated delays
  - Consider the impact on overall request speed
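For resource blocking, here is a sketch using Playwright for Python; the websocket endpoint is a placeholder, and connecting over CDP is an assumption about how the remote browser is reached:

```python
from playwright.sync_api import sync_playwright

# Placeholder endpoint; use the connection URL from your account
WS_ENDPOINT = "wss://browser.example.com?apikey=YOUR_API_KEY"

BLOCKED_TYPES = {"image", "media", "font", "stylesheet"}

with sync_playwright() as p:
    browser = p.chromium.connect_over_cdp(WS_ENDPOINT)
    page = browser.new_page()

    # Abort non-essential resource requests before they are downloaded
    page.route(
        "**/*",
        lambda route: route.abort()
        if route.request.resource_type in BLOCKED_TYPES
        else route.continue_(),
    )

    page.goto("https://example.com")
    print(page.title())
    browser.close()
```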
Speed Optimization Best Practices
- Start with Baseline: Begin with standard settings and measure response times
- Monitor Performance: Use response headers and timing metrics to track speed
- Gradual Optimization: Make incremental changes and measure their impact
- Smart Retries: Use exponential backoff for failed requests to maintain speed (see the sketch after this list)
- Target-Specific Tuning: Adjust settings based on the specific website’s performance
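A minimal sketch of exponential backoff with jitter, assuming Python with the `requests` package; the retry count and status-code checks are illustrative:

```python
import random
import time

import requests

def get_with_backoff(url: str, params: dict, max_retries: int = 5) -> requests.Response:
    """Retry failed requests, waiting exponentially longer between attempts."""
    for attempt in range(max_retries):
        response = requests.get(url, params=params, timeout=30)
        # Retry only on throttling or server errors; return everything else
        if response.status_code != 429 and response.status_code < 500:
            return response
        # Wait 1s, 2s, 4s, ... plus jitter so parallel retries don't synchronize
        time.sleep(2 ** attempt + random.random())
    raise RuntimeError(f"giving up on {url} after {max_retries} attempts")
```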
Remember: While these optimizations can improve request speed, some features (like JavaScript rendering) might be necessary for your specific use case. If you need help optimizing for speed while maintaining functionality, our support team is here to assist.