Are your requests to ZenRows failing? One possible reason is that you’re trying to scrape websites associated with illegal activities or engaging in unethical behavior. Read on to find out whether that applies to you.

ZenRows blocks access to certain off-limits websites (including, but not limited to, financial institutions, government websites, payment processors, and visa applications) and prohibits certain behaviors that, even on permissible sites, can result in account suspension. Make sure you’re aware of these restrictions to get the most out of our service without disruptions.

Understanding Access Restrictions

We provide a flexible web scraping API, but it’s essential to understand its limitations, which exist primarily for legal and ethical reasons.

Restricted Website Categories

  1. Banks: Websites of banking institutions.
  2. Credit Card Processors/Payment Gateways: This encompasses Visa, Mastercard, PayPal, Stripe, and similar payment platforms.
  3. Visas, eVisas, and Permits Websites: Platforms related to visa application or issuance.
  4. Governmental Websites: Mostly those with .gov domain extensions.

We also retain the discretion to block other sites when legal or ethical concerns warrant it. If you believe a site is blocked in error, or there’s a legitimate reason for accessing content from a restricted site, please contact us. We always welcome feedback and will review your appeal.

Encountering Restrictions - What to Expect

If you target a restricted site, the ZenRows API returns the following error:

  • HTTP Status Code: 400 Bad Request
  • Response Body:
{
  "code": "REQS001",
  "detail": "Requests to this URL are forbidden. Contact support if this may be a problem, or try again with a different target URL.",
  "instance": "/v1",
  "status": 400,
  "title": "Requests to this domain are forbidden (REQS001)",
  "type": "https://docs.zenrows.com/api-error-codes#REQS001"
}
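A minimal way to detect this error in client code is sketched below. The helper name is ours for illustration, not part of any ZenRows SDK; it simply checks the documented status code and `code` field of the parsed JSON body:

```python
def is_restricted_domain(status_code: int, body: dict) -> bool:
    """Return True when a response matches the documented REQS001 error."""
    return status_code == 400 and body.get("code") == "REQS001"


# Abridged version of the documented error body.
error_body = {
    "code": "REQS001",
    "status": 400,
    "title": "Requests to this domain are forbidden (REQS001)",
}

print(is_restricted_domain(400, error_body))       # True
print(is_restricted_domain(200, {"code": "OK"}))   # False
```

When the check returns True, log and skip the target URL and contact support if you believe the block is a mistake, rather than retrying the request.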

User Behavior Guidelines

Certain behaviors can lead to account suspension for ethical, legal, or operational reasons:

  1. Brute Forcing Login Forms: Prohibited due to security concerns.
  2. Heavy Traffic Burden: Overloading sites disrupts their operation. Respect the robots.txt file and avoid aggressive scraping.
  3. Scraping Copyrighted Content: Doing this without permission is legally contentious.
  4. Extracting Private Personal Information: Against GDPR and other privacy laws.
  5. Misrepresentation: Tactics such as spoofing user agents or using proxies deceptively are discouraged.

Recommendations and Best Practices

  1. Handle Errors Gracefully: Build error handling into your systems to manage potential restrictions seamlessly. If you receive an HTTP 400 status code, do not retry the request; repeated attempts can get you permanently blocked.
  2. Reach Out: Our support team is always ready to help, clarify, and listen.
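The first recommendation can be sketched as a simple retry policy. The thresholds below are illustrative assumptions, not official ZenRows guidance beyond "do not retry a 400":

```python
def should_retry(status_code: int) -> bool:
    """Decide whether a failed request is worth retrying."""
    if 400 <= status_code < 500:
        # Client errors (including the 400 REQS001 response) will not
        # succeed on retry; repeating them risks a permanent block.
        return False
    # Transient server-side errors may succeed on a later attempt.
    return status_code >= 500


print(should_retry(400))  # False
print(should_retry(503))  # True
```

Plugging a gate like this in front of any retry loop ensures restricted-domain errors fail fast instead of hammering the API.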

We aim for a balance between powerful web scraping and ethical web behavior. By respecting these guidelines, we can create a sustainable and beneficial web scraping ecosystem. For any queries or clarifications, our support team is here to assist.
