
Scrape Website & Auto Populate Google Sheet
£400 (approx. $534)
- Proposals: 29
- Remote
- #4451972
- Awarded
Description
Experience Level: Expert
Estimated project duration: less than 1 week
We are looking for somebody to undertake a project for us.
We would like to scrape all products from https://www.jesuk.com/
We have the login details for you to use, and these will be provided separately if required.
These are the column headers we need to populate:
Category 1
Category 2
Product Name
Product SKU
Product Description
Product Price
Product URL
Product Image URL
Product Brand
Breadcrumb
In Stock?
Here is the Google sheet to populate: https://docs.google.com/spreadsheets/d/1D42QEp4U8CEllpl7U-q8U0nCEW8GEQtflMCeGEexDus/edit?usp=sharing
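As a reference for bidders, a minimal sketch of how the scraped fields could be ordered into the eleven required columns and appended to the sheet. The gspread library and a service-account credential are assumptions of this sketch, not requirements from the brief; any Google Sheets client would do.

```python
# Sketch: map scraped product dicts onto the 11 required columns and
# append them to the Google Sheet. gspread and the service account
# are our choices, not specified in the brief.

COLUMNS = [
    "Category 1", "Category 2", "Product Name", "Product SKU",
    "Product Description", "Product Price", "Product URL",
    "Product Image URL", "Product Brand", "Breadcrumb", "In Stock?",
]

SHEET_ID = "1D42QEp4U8CEllpl7U-q8U0nCEW8GEQtflMCeGEexDus"  # from the brief


def product_to_row(product: dict) -> list:
    """Order a scraped product dict by the required column headers."""
    return [product.get(col, "") for col in COLUMNS]


def append_products(products: list) -> None:
    """Append one row per product; requires `pip install gspread`."""
    import gspread  # third-party; imported lazily so the mapping stays testable
    ws = gspread.service_account().open_by_key(SHEET_ID).sheet1
    ws.append_rows([product_to_row(p) for p in products],
                   value_input_option="RAW")
```

Keeping the column order in one list makes it easy to verify the sheet headers and the crawler output never drift apart.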
We require a daily (overnight) re-crawl that updates the spreadsheet with any product changes.
We also require an email to be sent to us with an Excel sheet attached displaying which products have changed and what the change was.
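The change-detection step could look roughly like the sketch below: compare today's crawl to yesterday's keyed by SKU, write the differences to an Excel file, and email it. The openpyxl library, the SMTP host, and the placeholder addresses are all assumptions.

```python
# Sketch: diff two crawls keyed by SKU, then email the changes as an
# Excel attachment. openpyxl and the SMTP details are assumptions.
from email.message import EmailMessage
import smtplib


def diff_products(old: dict, new: dict) -> list:
    """Return (sku, field, old_value, new_value) for every change.
    Added/removed products are reported with the field 'PRODUCT'."""
    changes = []
    for sku in new.keys() - old.keys():
        changes.append((sku, "PRODUCT", "", "added"))
    for sku in old.keys() - new.keys():
        changes.append((sku, "PRODUCT", "removed", ""))
    for sku in old.keys() & new.keys():
        for field in old[sku].keys() | new[sku].keys():
            before, after = old[sku].get(field, ""), new[sku].get(field, "")
            if before != after:
                changes.append((sku, field, before, after))
    return changes


def email_changes(changes: list, path: str = "changes.xlsx") -> None:
    """Write changes to Excel (`pip install openpyxl`) and email the file."""
    from openpyxl import Workbook
    wb = Workbook()
    ws = wb.active
    ws.append(["SKU", "Field", "Old value", "New value"])
    for row in changes:
        ws.append(list(row))
    wb.save(path)

    msg = EmailMessage()
    msg["Subject"] = "Daily product changes"
    msg["From"] = "crawler@example.com"  # placeholder address
    msg["To"] = "client@example.com"     # placeholder address
    with open(path, "rb") as f:
        msg.add_attachment(
            f.read(), maintype="application",
            subtype="vnd.openxmlformats-officedocument.spreadsheetml.sheet",
            filename=path)
    with smtplib.SMTP("localhost") as s:  # SMTP host is an assumption
        s.send_message(msg)
```

The overnight schedule itself would typically be a cron entry on the server that runs the crawler and then these two steps.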
We also need each product to have its image scraped and the image saved in this folder: https://drive.google.com/drive/folders/1vnUsLrll35_n-iR5GQlqLaL8hTfwJfNs?usp=sharing
Each image should be named by SKU (as exported from the crawl)
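Naming files by SKU needs a sanitising rule, since SKUs can contain characters that are illegal in filenames. A possible sketch, with the sanitising rule and local destination directory as our assumptions (pushing the files into the shared Drive folder would then go through the Google Drive API, not shown):

```python
# Sketch: download each product image and save it locally as <SKU>.<ext>.
# The character-replacement rule and the default .jpg extension are
# our choices; uploading to the Drive folder is a separate step.
import os
import re
import urllib.request


def sku_filename(sku: str, image_url: str) -> str:
    """Name the file by SKU, replacing filesystem-unsafe characters,
    keeping the extension from the image URL (default .jpg)."""
    safe = re.sub(r"[^A-Za-z0-9._-]", "_", sku)
    ext = os.path.splitext(image_url.split("?")[0])[1] or ".jpg"
    return safe + ext


def download_image(sku: str, image_url: str, dest_dir: str = "images") -> str:
    """Fetch one image over HTTP and return the saved path."""
    os.makedirs(dest_dir, exist_ok=True)
    path = os.path.join(dest_dir, sku_filename(sku, image_url))
    urllib.request.urlretrieve(image_url, path)  # network call
    return path
```

At ~50k products, the downloads would want throttling and retries so the crawl stays polite and survives transient failures.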
There are around 50k products to crawl.
Once the project is working correctly, we will need the source code so that we can run this from our own server.
We would like this completed by the 4th of December.
If you have any further questions, please ask.
Steve H. (United Kingdom)
- Projects completed: 83 (100% positive)
- Freelancers worked with: 64
- Projects awarded: 57 (42%)
- Last project: 3 Jun 2025