How do I Export Data to CSV?
Once you’ve extracted data using ZenRows, you might want to store it in CSV format. For simplicity, we’ll focus on a single URL and save the data to one file. In real-world scenarios, you might need to handle multiple URLs and aggregate the results.
To start, we’ll explore how to export data to CSV using both Python and JavaScript.
From JSON using Python
If you’ve obtained JSON output from ZenRows with the autoparse feature enabled, you can use Python to convert this data into a CSV file.
The Pandas library will help us flatten nested JSON attributes and save the data as a CSV file.
Here’s a sample Python script:
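The snippet below is a minimal sketch: it assumes the standard ZenRows API endpoint with the autoparse parameter enabled, and it uses a placeholder API key and a hypothetical target URL that you should replace with your own values.

```python
import requests
import pandas as pd

apikey = "YOUR_ZENROWS_API_KEY"          # placeholder, use your own key
url = "https://www.example.com/listing"  # hypothetical target URL

# Request the page through ZenRows with autoparse enabled to get JSON output
response = requests.get(
    "https://api.zenrows.com/v1/",
    params={"apikey": apikey, "url": url, "autoparse": "true"},
)
data = response.json()

# Flatten the nested JSON attributes into columns and save them as CSV
df = pd.json_normalize(data)
df.to_csv("output.csv", index=False)
```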
You can also adjust the json_normalize call to control how many nested levels to flatten and to rename fields. For instance, you can flatten only one inner level and strip the latLong prefix from the latitude and longitude fields.
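Here is a sketch of that adjustment, reusing the data object from the previous script (the latLong.latitude and latLong.longitude column names are assumed to come from the flattening step; check your own output for the exact names):

```python
# Flatten only one level of nesting; deeper structures are kept as-is
df = pd.json_normalize(data, max_level=1)

# Strip the "latLong." prefix so the columns become "latitude" and "longitude"
df.columns = df.columns.str.replace("latLong.", "", regex=False)

df.to_csv("output.csv", index=False)
```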
From HTML using Python
When dealing with HTML output without the autoparse feature, you can use BeautifulSoup to parse the HTML and extract data. We’ll use the example of an eCommerce site from Scraping Course. We’ll create a dictionary for each product with its essential details, then use Pandas to convert the list of dictionaries into a DataFrame and save it as a CSV file.
Here’s how to do it:
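The sketch below assumes the standard ZenRows API endpoint and a placeholder API key; the CSS selectors are also assumptions, so adjust them to the page’s current markup.

```python
import requests
import pandas as pd
from bs4 import BeautifulSoup

apikey = "YOUR_ZENROWS_API_KEY"  # placeholder, use your own key
url = "https://www.scrapingcourse.com/ecommerce/"

# Get the plain HTML result (no autoparse) through ZenRows
response = requests.get(
    "https://api.zenrows.com/v1/",
    params={"apikey": apikey, "url": url},
)
soup = BeautifulSoup(response.text, "html.parser")

# Build a dictionary with the essential details of each product.
# The selectors below are assumptions; tweak them to match the live page.
products = []
for item in soup.select("li.product"):
    name = item.select_one("h2")
    price = item.select_one(".price")
    link = item.select_one("a")
    products.append({
        "name": name.get_text(strip=True) if name else None,
        "price": price.get_text(strip=True) if price else None,
        "link": link["href"] if link else None,
    })

# Convert the list of dictionaries into a DataFrame and save it as CSV
df = pd.DataFrame(products)
df.to_csv("products.csv", index=False)
```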
From JSON using JavaScript
For JavaScript and Node.js, you can use the json2csv library to handle the JSON to CSV conversion.
After getting the data, we will parse it with a flatten transformer. As the name implies, it flattens the nested structures inside the JSON. Then, we save the file using writeFileSync.
Here’s an example using the ZenRows Scraper API with Node.js:
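The example below is a minimal sketch: it uses the axios HTTP client for the request (any HTTP client works), a placeholder API key, and a hypothetical target URL that you should replace with your own values.

```javascript
const axios = require('axios');
const { writeFileSync } = require('fs');
const { Parser, transforms: { flatten } } = require('json2csv');

const apikey = 'YOUR_ZENROWS_API_KEY';         // placeholder, use your own key
const url = 'https://www.example.com/listing'; // hypothetical target URL

(async () => {
    // Request the page through ZenRows with autoparse enabled to get JSON
    const { data } = await axios.get('https://api.zenrows.com/v1/', {
        params: { apikey, url, autoparse: 'true' },
    });

    // Flatten the nested structures and convert the JSON to CSV
    const parser = new Parser({ transforms: [flatten()] });
    const csv = parser.parse(data);

    // Store the result on disk
    writeFileSync('output.csv', csv);
})();
```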
From HTML using JavaScript
For extracting data from HTML without autoparse, you can use the cheerio library to parse the HTML and extract the relevant information. We’ll use the Scraping Course eCommerce example again for this task.
As with the Python example, we extract data from HTML without the autoparse feature. We get the plain result and load it into cheerio, which lets us query elements as we would in the browser or with jQuery. We return an object with the essential data for each product in the list, parse that list into CSV using json2csv (no flatten transform is needed this time), and finally store the result. These last two steps are similar to the autoparse case.
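Here’s a sketch of that flow. As before, the axios client, the placeholder API key, and the CSS selectors are assumptions to adapt to your own setup and to the page’s current markup.

```javascript
const axios = require('axios');
const { writeFileSync } = require('fs');
const cheerio = require('cheerio');
const { Parser } = require('json2csv');

const apikey = 'YOUR_ZENROWS_API_KEY'; // placeholder, use your own key
const url = 'https://www.scrapingcourse.com/ecommerce/';

(async () => {
    // Get the plain HTML result (no autoparse) through ZenRows
    const { data: html } = await axios.get('https://api.zenrows.com/v1/', {
        params: { apikey, url },
    });

    // Load the HTML into cheerio and query it as you would with jQuery.
    // The selectors below are assumptions; tweak them to match the live page.
    const $ = cheerio.load(html);
    const products = $('li.product').map((_, el) => ({
        name: $(el).find('h2').text().trim(),
        price: $(el).find('.price').text().trim(),
        link: $(el).find('a').attr('href'),
    })).get();

    // The objects are already flat, so no flatten transform is needed
    const csv = new Parser().parse(products);

    // Store the result on disk
    writeFileSync('products.csv', csv);
})();
```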
If you encounter any issues or need further assistance with your scraper setup, please contact us, and we’ll be happy to help!