Trial Plan

$0.00 for 5,000 pages

  • 1 User
  • 5,000 Pages
  • Up to 50,000 records
  • 100 projects
  • Limited support

Standard Plan

$10.00 per 100,000 pages

  • Credits never expire
  • 100,000 Pages
  • 1 Million Records
  • (Up to 100k Pages x 10 records)
  • 5 GB Storage
  • Unlimited projects
  • Lifetime use
  • Skype support

Professional Plan

$20.00 per 250,000 pages

  • 250,000 Pages
  • 2.5 Million Records
  • (Up to 250k Pages x 10 records)
  • 15 GB Storage
  • Unlimited projects
  • Lifetime use
  • Skype support

Enterprise Plan

$30.00 per 500,000 pages

  • 500,000 Pages
  • 5 Million Records
  • (Up to 500k Pages x 10 records)
  • 30 GB Storage
  • Unlimited projects
  • Lifetime use
  • Skype support

Use Cases

Overview

Both individuals and enterprises can use our SaaS (Software as a Service) to collect text, files, and images across many pages.

Visual Scraper Agent is software you use to point and click which text, files, and images to collect. You can teach our software to extract exactly what you want from the web. We call each such project an "Agent".

Web Console is a website where you can download/export and manage your agents/projects online.

Agent Support is like a helpdesk where you can ask questions so that we can help you extract data with your agents.

Features

No monthly cost. Your account depends on your Page Credits.

A page is any individual request to a website to load a web page, or any new portion of a web page, needed for a project to complete navigation or data gathering. Page credits are non-refundable and non-transferable, and can be used as long as your account is active. Unused Professional account pages roll over to subsequent months.

Automate data extraction from any website. Scrape as few or as many pages as needed to create valuable data files, structured how you want it.

VisualScraper gives you the ability to schedule your projects to run at any given time or time interval. Your computer doesn't even need to be on to run your projects.

In just seconds you can send all the data you have captured directly to your desktop in CSV, TSV, or XML format. If you want to view your data in Excel, you can export in CSV or XML. When prompted, click "Open", and by default, your data will be opened in XLS format.

VisualScraper provides an Anonymous Proxy Service for a fee. (1 anonymous page = 3 page credits; 1 anonymous search-engine page = 7 page credits.)

You can configure VisualScraper to send an email when projects finish, error, publish, or fail to publish.

Automatically send data from the VisualScraper system to an FTP server or email address. This can be done on a predefined schedule or every time the project runs.

Within your Collections, you can refine your data to see or export only certain fields of data. For example, if you are gathering contact information from a web directory that includes name, address, phone, and email, but you only want to see or use the email field and nothing else, you can create a custom view to display only the emails, and then export them if you wish.
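As a sketch of the custom-view idea above, filtering exported data down to a single field can look like the following in Python. The sample records and function name are ours, for illustration only, not part of the product:

```python
import csv
import io

# Hypothetical scraped records, mirroring the web-directory example above
records = [
    {"name": "Acme Co", "address": "1 Main St", "phone": "555-0100", "email": "info@acme.example"},
    {"name": "Beta LLC", "address": "2 Oak Ave", "phone": "555-0101", "email": "sales@beta.example"},
]

def export_fields(records, fields):
    """Write only the chosen fields of each record as CSV text."""
    out = io.StringIO()
    # extrasaction="ignore" silently drops every field not in the view
    writer = csv.DictWriter(out, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(records)
    return out.getvalue()

csv_text = export_fields(records, ["email"])
```

Here the "custom view" is just the `fields` list: only those columns survive into the export.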

For an extra fee, you can download images (GIF, TIF, JPEG, etc.) or files (.doc, .xls, .pdf, etc.) from any webpage. Images and files are renamed according to your specifications, and then stored in a separate folder that can be exported or published via FTP. 1 image download = 2 page credits; 1 file download = 4 page credits; 1 PDF document = 4 page credits to download plus 2 page credits per PDF page extracted.
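The credit arithmetic above can be captured in a small helper. This is an illustration only: the rates (2 credits per image, 4 per file, 4 + 2 per page for PDFs) come from the text, while the function and constant names are ours:

```python
# Per-item page-credit costs, as listed in the pricing text above
CREDITS = {"image": 2, "file": 4}

def pdf_credits(pages):
    """A PDF costs 4 credits to download plus 2 credits per extracted page."""
    return 4 + 2 * pages

def job_credits(images=0, files=0, pdf_pages=()):
    """Total credits for a batch; pdf_pages holds one page count per PDF."""
    total = images * CREDITS["image"] + files * CREDITS["file"]
    total += sum(pdf_credits(p) for p in pdf_pages)
    return total
```

For example, 5 images, 2 files, and one 3-page PDF would cost 10 + 8 + 10 = 28 page credits.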

Additional storage over allotted amount is $10 per 1 GB, billed monthly.

Professional account users can programmatically manage projects and data using the VisualScraper REST API. For more info, check out our API documentation.

One time data delivery

 

  • 1 Time Only
  • 200,000 Records
  • Email & Project Support
  • Extract Only Once
  • Unlimited Bandwidth

Recurring Data Delivery

+ $50/month

  • Monthly
  • 200,000 Records
  • Email & Project Support
  • Recurring Extraction
  • Unlimited Bandwidth

Simple Software Extractor

$200+ for your own unique software

  • For big data
  • $100 per extra website
  • Your own unique software
  • Unlimited Records
  • Runs on your Windows machine or our server
  • 6 months of maintenance
  • Unlimited Pages

Advanced Software Extractor

$300+ for your own unique software

  • For big data
  • $100 per extra website
  • Requires account login
  • Requires proxies
  • Requires multithreading/multiprocessing
  • Complicated Ajax responses
  • Captcha bypass
  • Extract text/phone numbers from images
  • Extract emails/addresses/text/names from PDFs

Web scraping service

You set up a project and outline your requirements. Our staff then reviews the requirements and sets up the crawler to collect the data for you. You can then view the live data stream from within the project. After the crawl is complete, you can download the data as spreadsheets or XML.

After the one-time data collection is done, you can schedule the crawler to run repeatedly (daily/weekly/monthly). With each run, you get fresh data from the source delivered to you. The fee starts at $50 per month.

If you want to extract a single website (e.g., Yellow Pages), you can hire us to write unique software for you that runs on your Windows platform or on our server.

Advanced Software Extractor is for crawling huge amounts of data across multiple sites. It is meant for big data only. Unlike the simple web scraping software, it is not limited to a single website and can cover three or more sites.

What is SIMPLE SOFTWARE EXTRACTOR?

What is the simple software scraper, and why would you need it?
It is designed for customers who need to pull all the data from a handful of sites (e.g., Flipkart, Amazon, Snapdeal). Some people want to extract products as in the program above. The difference between "advanced" and "simple" is whether the job requires authentication, proxies, multiple threads, or other time-consuming work.

Why do we need this simple software?
Companies want to research the market or collect customer reviews and feedback faster and more easily, for a one-time cost. It can serve entire large companies, and it puts all the data into a single spreadsheet for analysis. It suits a variety of purposes.

Do you offer a service that runs the program on your computer instead of mine?
Yes, we do. You will need to pay an extra $50/month. The data will be delivered by FTP, email, an Amazon S3 bucket, or VisualScraper's website.

What payment methods do you accept?
Paypal: busyneed@gmail.com

Some Use Cases

Yelp - Directory Listing for local businesses and Reviews
Yelp is based in San Francisco, California. It is a directory listing of many different companies' addresses. Yelp is very popular in the United States and is known for its crowd-sourced reviews of local businesses. Yelp also runs an online reservation service, SeatMe, and an online food-delivery service, Eat24. Businesses scrape Yelp to pull valuable information about potential customers, competitor information, and customer reviews, or to learn who the top companies in their area are.
  1. Scrape Yelp data by business field (e.g., live music venues across the US or UK)
  2. Scraping Yelp requires multiple IP addresses
  3. Scraping Yelp requires slow request rates
  4. 50,000 records takes 3-5 days

Amazon Product Scraper
Scraping Amazon products requires a clean web browser that surfs the web like a real human. Just like Yelp, Amazon will ask you to enter a captcha if it detects suspicious activity. Amazon data scraping requires multiple IP addresses as well: the more queries we run, the more IP addresses we must own.
  1. Enter keywords and click to scrape
  2. Scrape the detailed information for each product in the list
  3. Schedule it to scrape every hour or every day
  4. Send fresh JSON-format data through an FTP server

Website automation
You have an account and you want to automate all your actions on the web. It's really convenient to have software that does all the actions for you.
What the software does:
  1. Log in with the configured account
  2. Navigate through pages
  3. Search with the configured search terms
  4. Scrape the list of information
  5. Navigate to each item's detail page
  6. Scrape the detail page
  7. Navigate to the next page (pagination)
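The navigate/scrape/paginate steps above boil down to a loop that fetches page after page until nothing comes back. A minimal sketch, with a stub standing in for the real logged-in browser session; all names here are hypothetical:

```python
def crawl(fetch_page, start_page=1):
    """Walk numbered pages until the fetcher returns no items.

    fetch_page(n) is assumed to return a list of records for page n,
    or an empty list once pagination runs out.
    """
    results = []
    page = start_page
    while True:
        items = fetch_page(page)
        if not items:
            break  # pagination exhausted
        results.extend(items)
        page += 1
    return results

# Stub fetcher standing in for a real authenticated HTTP/browser session
def fake_fetch(page):
    data = {1: ["a", "b"], 2: ["c"]}
    return data.get(page, [])

collected = crawl(fake_fetch)
```

The real software would replace `fake_fetch` with code that drives the browser: log in, submit the search terms, then scrape each detail page.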

Extract data from linkedin
Businesses want to scrape other businesses' employee lists and automate work they would otherwise do manually.
What the software does:
  1. Enter search terms
  2. Click search
  3. Extract the list of companies and employees
  4. Click through and collect them
  5. Check the notifications for approvals
  6. Click accept automatically
  7. Automatically send a templated message to each approved user
  8. Send them birthday messages
  9. Send greetings and announcements
  10. Collect the Skype ID, phone, and address from their profile
  11. Search Google for their Facebook and Google+ pages and collect the About page
  12. If there is a phone number, check the owner for criminal records, public records, reverse phone lookup, and arrest records
  13. Keep pushing the data to the client's MySQL database

Distributed Web Crawling System
Businesses want to build large data sets by collecting various types of data across multiple websites. However, websites are complex and each must be hand-coded to scrape, so we create a concurrent web-scraping software node for each one. To manage many crawlers at the same time, the system needs a server side to receive and monitor data from multiple crawlers. The programmers and staff also need a single piece of software to manage and monitor all the virtual private servers, crawlers, and cloud-based servers.
Businesses of this kind are typically startups out to revolutionize a whole sector: food listing sites, travel listing sites, classified listing sites, product comparison sites, all-in-one news, job listing sites, and so on.
It is just like the VisualScraper system, with one difference: everything runs hidden on your own server side, which means it doesn't need to look beautiful; it only needs to work.
What the system has:
  1. Multiple crawlers, one for each website
  2. A web server
  3. A MySQL database
  4. The Hive: like a brain, it controls and hands out jobs to the crawlers
  5. The Queen: that's you; you monitor the crawlers, the Hive, and the PCs
  6. Multiple PCs/VPSes
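The Hive's job of handing work out to crawler nodes can be sketched with a thread-safe queue. This is a toy model under our own naming, not the actual system; `crawl` stands in for a real per-site crawler and the workers stand in for separate crawler processes:

```python
import queue
import threading

def run_hive(jobs, crawl, workers=3):
    """The 'Hive': distribute crawl jobs to workers and collect results."""
    job_q = queue.Queue()
    results = []
    lock = threading.Lock()

    for job in jobs:
        job_q.put(job)

    def worker():
        while True:
            try:
                job = job_q.get_nowait()
            except queue.Empty:
                return  # no work left for this crawler
            record = crawl(job)  # stand-in for scraping one site
            with lock:
                results.append(record)
            job_q.task_done()

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

In the real system each worker would be a crawler on its own PC/VPS, reporting back to the web server and MySQL database instead of an in-memory list.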

Smart search for company emails, phone numbers, and addresses
You have a list of client company names and you want to find their email addresses, websites, and phone numbers. This software finds the right website for each provided company name and then crawls that website to extract every email, phone number, address, Skype ID, Yahoo Messenger ID, and postal code.
What this software does:
  1. Enter the provided keywords (client company names or search terms) into the Google search bar
  2. Scrape all the websites in the Google results
  3. Crawl through each website and extract key factors to determine the right company
  4. Extract the email addresses
  5. Extract the phone numbers
  6. Extract the company's address and postal code
  7. Put them all into a nice CSV spreadsheet file
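The email and phone extraction steps above are often done with regular expressions over the page text. A rough sketch: these patterns are deliberately simple illustrations, not production-grade contact extraction, and the sample page text is ours:

```python
import re

# Rough patterns for illustration; real contact scraping needs more care
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"\+?\d[\d ()-]{7,}\d")

def extract_contacts(text):
    """Pull candidate emails and phone numbers out of page text."""
    return {
        "emails": EMAIL_RE.findall(text),
        "phones": PHONE_RE.findall(text),
    }

page = "Contact us: sales@acme.example or +1 (555) 010-0199."
contacts = extract_contacts(page)
```

The extracted rows would then be written out with the `csv` module to produce the final spreadsheet.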

Extract email from PDF
The software scrapes a website's PDFs and extracts all names and emails.
What this software does:
  1. Search on the targeted website
  2. Scrape all the search results, including pagination
  3. Navigate into each detail page and scrape the information
  4. Download the required PDF files and extract the text from them
  5. Extract all the authors' names and emails
  6. Put them into nice CSV files

Extract product data from shopping cart - big data stores
This site has approximately 54,000 records and is built with AngularJS, a complicated JavaScript framework. It also uses Google reCAPTCHA and APIs to block unusual traffic. Gladly, we did it.
We extracted the website's entire database, including the images.
Ajax pages like this one are hard to scrape.
  1. Scrape all products of websites once first
  2. Keep scraping the products hourly for updated data
  3. Running on multiple Virtual Private Server ( VPS )
  4. IP Rotation
  5. Push data to one database
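The IP-rotation step above can be as simple as cycling through a pool of proxies so that successive requests exit through different VPSes. A minimal sketch; the proxy addresses below are placeholders, not real servers:

```python
from itertools import cycle

# Placeholder proxy pool; in production these would be your VPS exit addresses
PROXIES = ["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"]

proxy_pool = cycle(PROXIES)

def next_proxy():
    """Round-robin rotation: each request goes out through the next proxy."""
    return next(proxy_pool)
```

An hourly scheduler would call `next_proxy()` before each request, wrapping back to the first proxy once the pool is exhausted.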


Web Image Grabber
Scraping images has never been easier. We can scrape images from websites or extract the text inside the images as well. With our web image grabber, you can scrape every image from any website.


Extract text from image - an OCR web scraping software
Scraping text in an image requires OCR, which stands for "Optical Character Recognition". With an OCR library, we can scrape phone numbers, addresses, or other text hidden inside images.
  1. The scraper tool scrapes the website's data and images
  2. A second program processes all the images and extracts the text from them
  3. Put it all into a nice CSV spreadsheet file


Yellowpages Web Scraping - DIRECTORY LISTING
The client was very happy with the scraping results. It helped her find out who her competitors are in her niche market. Specifically, we scraped all the categories and all the companies in each category.
Over 300,000 records were scraped by VisualScraper staff. The client was happy, and they used the database to contact all their potential customers. What the software does:
  1. Scrape the whole Yellow Pages directory
  2. Scrape the search results for the provided company names
  3. Click through and scrape the email address, phone number, address, and website from the search results

MyWatch - Swedish watches
Cykellagret - bicycle site
Pinkorblue - products for kids
Dustinhome - computer hardware
Obitsforlife
This client is also very happy with the scraping results and the console software, which keeps the data from this site and others up to date automatically. The project extracted the whole website's records once, and then automatically extracts the new records every day. It extracted not only text but also images.

Skype us now or leave us an email

Customers are happy

Web scraping of Sweden Watch image data