Freelance Data Scraping Jobs
Looking for freelance Data Scraping jobs and project work? PeoplePerHour has you covered.
Data Scraping from LinkedIn search results
I need a data entry person to search out and scrape 200 relevant posts on LinkedIn using the keywords provided, filter the search to the United States, and add the data to an Excel sheet. Data needed includes:
1. Profile name
2. LinkedIn profile link
3. Post link
I recorded a Loom to help guide the process: https://www.loom.com/share/1191b16f8a124767a3c1d1f68fb6616b?sid=b8d5833f-3a3d-4e44-bd82-13170a35e165
Document containing keywords to search: https://docs.google.com/document/d/19pXU0P6IvUVO_PrVSre36MmOBQ-CBFLt7oIq3x7FmOM/edit?usp=sharing
7 days ago · 69 proposals · Remote

Web Scraping
Hi, I want to scrape data from justdial.com. It shows only 100 records per query, but I need to bypass the security and scrape 1k-2k records instead of 100. Can anyone help? FYI, it is a business directory like Yelp, Yellow Pages, etc.
21 hours ago · 18 proposals · Remote

Selenium-based data scrape of targeted LI profiles
The creation of a SQL database by scraping data from ~345,000 records from LinkedIn.
- Selenium scripts to capture multiple fields from LI profiles (including Summary, Role Descriptions, Dates in Roles)
- The storing of the data in an SQL database
- Detailed technical documentation about each script
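As a rough illustration of the storage side of a brief like this, a minimal sketch of the SQL layer, assuming the Selenium scripts have already captured each profile into a dict (the table and field names here are hypothetical, not from the posting):

```python
import sqlite3

# Hypothetical schema covering the fields the brief mentions:
# summary, role descriptions and dates in role, keyed by profile URL.
SCHEMA = """
CREATE TABLE IF NOT EXISTS profiles (
    profile_url TEXT PRIMARY KEY,
    name        TEXT,
    summary     TEXT
);
CREATE TABLE IF NOT EXISTS roles (
    profile_url TEXT REFERENCES profiles(profile_url),
    title       TEXT,
    description TEXT,
    start_date  TEXT,
    end_date    TEXT
);
"""

def store_profile(conn, profile):
    """Insert one scraped profile (a dict) and its roles."""
    conn.execute(
        "INSERT OR REPLACE INTO profiles VALUES (?, ?, ?)",
        (profile["url"], profile["name"], profile["summary"]),
    )
    for role in profile.get("roles", []):
        conn.execute(
            "INSERT INTO roles VALUES (?, ?, ?, ?, ?)",
            (profile["url"], role["title"], role["description"],
             role["start"], role["end"]),
        )
    conn.commit()

# Demo with an in-memory database and one sample record:
conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
store_profile(conn, {
    "url": "https://www.linkedin.com/in/example",
    "name": "Example Person",
    "summary": "Data engineer.",
    "roles": [{"title": "Engineer", "description": "Built pipelines.",
               "start": "2020-01", "end": "2023-06"}],
})
```

At ~345k records, a server database (PostgreSQL/MySQL) would be the more likely target than SQLite, but the insert logic stays the same.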
17 hours ago · 26 proposals · Remote

Seeking Web Scraper for Adjustments to Existing Real Est. Script
I am a Python enthusiast in need of a freelance web scraper for a personal project: making adjustments to an existing Python script that extracts data from Idealista.com. The goal is to refine the script to handle the website's anti-scraping measures effectively.

Project Overview:
- Script Adjustment: Update and optimise an existing, working script to improve data extraction reliability and efficiency; it currently has issues with the website's anti-scraping defences.
- Anti-Scraping Strategy: Implement simple and cost-effective strategies to overcome scraping blocks.

Requirements:
- Proven experience in modifying existing web scraping scripts.
- Familiarity with Python and scraping libraries like BeautifulSoup or others.
- Understanding of techniques to bypass or handle anti-scraping technologies.
- Ability to work within a tight budget and deliver quick solutions.

Deliverables:
- An updated, fully functional scraping script.
- Brief documentation on the changes made and how to handle potential future issues.

Project Duration: Quick turnaround expected.

Budget: Since this is an adjustment to an existing script, I am looking for a cost-effective solution. Please provide your fixed-rate offer for this adjustment. I can provide my existing code so you can evaluate the changes needed.

If you are interested, please respond with any initial thoughts or specific techniques you think could be effective. I am looking to start as soon as possible and appreciate your swift response. Thank you!
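Two of the simplest, cheapest mitigations usually tried first for jobs like this are rotating request headers and exponential backoff with jitter between retries. A minimal sketch (generic technique, not Idealista-specific; the user-agent strings are illustrative placeholders):

```python
import random

# Illustrative pool of desktop User-Agent strings to rotate through;
# a real project would keep a larger, up-to-date list.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
    "Mozilla/5.0 (X11; Linux x86_64)",
]

def next_headers(rng=random):
    """Headers for the next request: pick a User-Agent at random."""
    return {"User-Agent": rng.choice(USER_AGENTS),
            "Accept-Language": "en-GB,en;q=0.9"}

def backoff_delay(attempt, base=1.0, cap=60.0, rng=random):
    """Exponential backoff with full jitter: a random delay between
    0 and min(cap, base * 2**attempt) seconds before retry `attempt`."""
    return rng.uniform(0, min(cap, base * 2 ** attempt))
```

The delay would be passed to `time.sleep()` around whatever HTTP call the existing script makes; injecting `rng` keeps both helpers deterministic under test.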
2 days ago · 19 proposals · Remote · opportunity
Data Extraction and Input
Hi all, I am looking for help to pull some price modelling data from a website and place it into an Excel file that replicates how it is shown on the web page. There is a lot of data and I do not think it can be exported; it needs to be done manually. I do not think it is viable to scrape the data. Once we have pulled the data we then need to build charts / data filtering to display the data correctly. My feeling is that the freelancer who does the job will need to be an expert in MS Excel. It would be easier to do a Zoom call to show the freelancer the system and explain fully what is required, so the price below is a placeholder. Thanks!
5 days ago · 66 proposals · Remote

Web scraping
I am looking for web scraping software that, based on input, produces a spreadsheet of URLs, email addresses and phone numbers extracted from "contact us" pages, using legitimate methods. A URL scraper will also need to be built into the application to facilitate the above process.
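The extraction step of a tool like this is typically a pass over the page's visible text with loose patterns for emails and phone numbers. A stdlib-only sketch (the phone pattern below is a deliberately rough UK-style assumption; any real project would tune it to the target market):

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the visible text of an HTML page, skipping script/style."""
    def __init__(self):
        super().__init__()
        self.parts, self._skip = [], False
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True
    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False
    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
# Loose UK-style phone pattern (leading 0 or +44); illustrative only.
PHONE_RE = re.compile(r"(?:\+44|0)\d[\d ]{8,11}\d")

def extract_contacts(html):
    """Return (emails, phones) found in the page's visible text."""
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(parser.parts)
    return (sorted(set(EMAIL_RE.findall(text))),
            sorted(set(PHONE_RE.findall(text))))

# Demo on a tiny inline page instead of a live fetch:
page = "<html><body>Call 020 7946 0123 or email info@example.com</body></html>"
emails, phones = extract_contacts(page)
```

The fetching side (and the built-in URL scraper the brief asks for) would feed pages into `extract_contacts` and write the results out with `csv` or a spreadsheet library.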
8 days ago · 30 proposals · Remote

Data scrape consultation
An expert consultation is sought regarding modern artificial intelligence techniques for efficient data scraping. For over nine years, traditional scraping methods using common websites have been employed to gather information from Google Maps and LinkedIn. However, newer AI technologies promise scraping capabilities at significantly reduced costs compared to established solutions. Rather than using code or tools directly, the consultant would identify and briefly describe the ten best scraping platforms that leverage artificial intelligence to deliver high-quality data rivaling industry leaders, but at optimized pricing. In return for a 10-minute consultation providing these top scraping website recommendations, a £10 payment will be provided. The aim is to uncover the highest data value possible through cutting-edge AI scraping alternatives.
a month ago · 28 proposals · Remote · opportunity
I need a News Bot in C, C++ or Python
Hello, I need news bots written in C, C++ or Python with the following features. I need 30 news bots that will scrape the latest news from some big news sites' category pages or RSS pages, translate the scraped data from the source site language to the target site language, and post the news articles, with image and features like news title, tags etc., to WordPress sites with the same or different categories via crons. The bots' main system will be the same for all 30 bots; you can create one bot scraping from 30 different sites, or you can create 30 different bots. I will give extra details in the workstream. But I will refuse freelancers who give a low price in their proposal and ask for extra money later. I have a fixed budget. You can pass on this task if this money doesn't make you happy.
7 days ago · 23 proposals · Remote · pre-funded
I would like some pages scraped which are behind reCAPTCHA
I have a list of 50k webpages of the same design on the same website, each of which includes a piece of text I want scraped and exported. The problem is that the page is protected by "I am not a robot" reCAPTCHA. So I would like someone to take my list of 50k URLs and return the missing line of text for each page in a CSV or Google Sheet. Note that I do not need the code you use - just the end result. Here is a redirect to an example of the pages I need scraped: https://www.temporary-url.com/9C6C0 Please check the page first and explain in your answer what the site and the info I need are (this is to make sure people have read the ad fully and checked out the task. Without this, your reply will be marked as spam). Please give me the cost and time it would take to provide them all.
15 days ago · 29 proposals · Remote

Seeking No-Code Architect for Automation Project
We are seeking a talented no-code architect to assist us with a project focused on web scraping, content filtering, and process automation. The ideal candidate will have expertise in identifying and configuring the best no-code tools and platforms for our requirements, enabling us to automate processes and extract valuable insights without traditional coding.

Responsibilities:
- Evaluate project requirements and objectives for web scraping, content filtering, and automation.
- Research and assess available no-code platforms, tools, and integrations suitable for the project's needs.
- Recommend and select the most suitable no-code solutions for web scraping, data filtering, and automation tasks.
- Configure and set up integrations between selected no-code tools and platforms to achieve seamless automation.
- Provide guidance and best practices for leveraging no-code solutions effectively to meet project goals.

Requirements:
- Proven experience as a no-code architect or consultant, with a strong track record of implementing successful no-code solutions.
- Comprehensive knowledge of various no-code platforms, tools, and integrations, particularly those suitable for web scraping and automation.
- Familiarity with web scraping techniques, data extraction methods, and content filtering processes.
- Strong problem-solving skills and the ability to assess project requirements to recommend suitable no-code solutions.
- Excellent communication skills and the ability to collaborate effectively with project stakeholders.

If you're passionate about harnessing the power of no-code tools to automate processes and drive efficiency, we'd love to hear from you! Please provide details of your relevant experience and examples of previous projects involving no-code architecture and automation.

Budget: Flexible, depending on experience and scope of work.
Duration: Short-term project with potential for ongoing collaboration.
Deadline for Applications: [Insert deadline or state "Open until filled"] We look forward to reviewing your proposals and discussing how we can work together to achieve our project goals. Thank you for considering this opportunity!
6 days ago · 13 proposals · Remote

I need text scraped from 8000 URLs
I have an Excel file with 8000 URLs in column A. In column B, I need the text scraped from each of those websites (in plain text).
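The core of a job like this is a loop over column A that fetches each page and flattens the HTML to plain text for column B. A stdlib sketch, assuming the workbook is exported to CSV (the `fake_fetch` stub stands in for a real HTTP call, which would use `urllib`, `requests`, or a headless browser):

```python
import csv
import io
import re

def html_to_text(html):
    """Crude tag-stripping; adequate for a first pass, not for messy pages."""
    text = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"<[^>]+>", " ", text)
    return re.sub(r"\s+", " ", text).strip()

def fill_column_b(rows, fetch):
    """rows: [[url], ...] from column A; returns [[url, text], ...].
    `fetch` is injected so the download backend can be swapped
    without touching this logic."""
    return [[url, html_to_text(fetch(url))] for (url,) in rows]

# Demo with a stubbed fetcher instead of 8000 live requests:
def fake_fetch(url):
    return f"<html><body><h1>Page</h1><p>Text for {url}</p></body></html>"

src = io.StringIO("https://example.com/a\r\nhttps://example.com/b\r\n")
rows = list(csv.reader(src))
out = fill_column_b(rows, fake_fetch)
```

For 8000 URLs, the fetch step would also want timeouts, retries, and polite rate limiting; writing `out` back with `csv.writer` (or `openpyxl` for native .xlsx) completes the round trip.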
5 days ago · 35 proposals · Remote

Xero - data developer
Hello, our company is looking for a Xero Accredited Developer who could help us connect our custom Access database (called Switchboard) to Xero so we can use it for one-click invoicing. We currently have it set up with Sage, but we are in the process of moving our accounting to Xero. Please get in touch if you can help. Many thanks, Jana Auton
2 hours ago · 2 proposals · Remote

Save HTML Page Python Script
Hi All, I have a Python script that works but needs a bit of editing. Is this something you can help with today? I need to scrape the site today. Thanks, Phil
9 days ago · 15 proposals · Remote

Recover my data.
Hello, my Google Drive has been hacked and the password changed. This is apparently quite common, but Google provides no way or support to fix the issue. I have an old list of contacts, but without email addresses and mobile phone numbers. Any proposals on how to recover the missing data, please?
9 days ago · 18 proposals · Remote

Create ChatGPT bot scraping content from PC
Seeking a programmer to create a ChatGPT bot to self-train on my data on my PC in Word, PDF, Excel and PPT files, and also Outlook, and use that to generate responses with ChatGPT-4. Every day new files will be added to the folder. The bot needs to retrain whenever a new file is added or when manually prompted. Must have experience with ChatGPT, content scraping, etc. Urgent start, 2-day turnaround needed.
17 days ago · 10 proposals · Remote

I need LinkedIn profiles of the owners from a Google Maps scrape
I need you to do the following:
1. Scrape all the 'IT-consultancy' companies in The Netherlands from Google Maps, including their details and website
2. Provide all the LinkedIn profile URLs of these companies
3. Scrape the # of employees of these companies (as listed on LinkedIn)
4. Provide the LinkedIn profiles of the owner(s) of these companies
Given the sheer size, this needs to be done automatically and not manually. Budget TBD.
15 days ago · 29 proposals · Remote

Python Developer - Scraping Expert
Hi All, I have a Python script that works but needs a bit of editing. Is this something you can help with today? Thanks, Phil
22 days ago · 31 proposals · Remote

I need B2C data for a UK area:
Sussex, population: 250,000. I need consumer/B2C email addresses for this area, towards that figure. They need to be homeowners, and I just need the name, address, telephone and email addresses, nothing else. I will literally accept the cheapest bid...
8 days ago · 24 proposals · Remote

Data enricher needed to find up-to-date decision-maker details
Data enrichers needed to search online and find up-to-date decision-maker details for our contact lists. You will be checking data points for suitability, searching for decision-maker names, emails and phone numbers, and adding them to our existing data lists.

You will be checking and completing:
- Is the lead within our target audience
- Making sure the business is still open
- Finding a general email
- Finding a general phone number
- Entering the Facebook profile link
- Entering the Instagram profile link
- Finding decision-maker details

You will be given full instructions and a video guide to follow. Once the information is gathered, we need it added into a Google Sheet, which we then use for our outreach.

To start, we need 500 leads enriched a week, and we are happy to pay $125 for this amount. You will be paid $500 a month for 2000 leads per month. Ongoing long-term work. 3 months of work guaranteed. If you do a good job we will extend it to 12 months. We will give bonuses for consistently good jobs. If you're good, I will pay you more and give you even more work if you can handle it :)

You will be spending time gathering the decision makers' details and information however you see fit. Information can be found by checking news articles, Companies House, LinkedIn, recruitment websites and company websites. We can also provide Gmail accounts so you can access free trials on tools like Hunter and Lusha.

We will be requiring 5 leads enriched as a free trial so we can see the quality of your work and your commitment to the role, but successful applicants can be confident they will have long-term consistent paid work. Open the sheet, make a copy, then fill it out with your trial work and send it back to me. DO NOT request to edit. Make a copy. Sheet here: https://docs.google.com/spreadsheets/d/1mObaIKSCFGf7AC8uVFi1FEP4-h0GovPj03yRg_pBjbc/edit?usp=sharing

Please look for decision makers with the following job titles, and look to gather Name and Email.
Job titles: Founder, CEO, Managing Director, Director, Co-founder, Owner, Co-owner, General Manager, Operations Manager, Marketing Manager. Prioritise Founders, CEOs, Owners and Directors before moving on to try those in manager positions. You should try to provide at least 1 decision maker for each lead, but more if possible. Each lead should take 4-5 minutes. We will regularly be checking the quality of your work and the contact details you have provided. You will be monitored on the % of businesses you find decision-maker contact details for, and the % of valid details (bounce/delivery rates). You will be required to provide a report each week outlining how many leads have been enriched, the % that have numbers found, the % that have emails found, and the % that have failed.
15 minutes ago · 5 proposals · Remote

Google BigQuery specialist to handle data migration
Looking for a skilled BigQuery specialist to facilitate the migration of analytical data from Google Analytics into BigQuery. Namely, this involves the transfer of lifetime Google Analytics 3 data as well as the backfilling and ongoing synchronisation of Google Analytics 4 information. The selected candidate will configure BigQuery and design optimised storage schemas to preserve all pertinent metrics and dimensions from GA3 while ensuring comprehensive historical datasets from GA4. Automated scripting must also be set up to upload new GA4 data on a daily basis and implement retention policies addressing privacy regulations. Data integrity and performance are top priorities. You should demonstrate extensive experience architecting similar analytics solutions on BigQuery. The ability to future-proof the platform through an elastic data architecture and support ongoing needs as they evolve is indispensable. Please provide examples of past projects, rates, and anticipated timelines.
an hour ago · 5 proposals · Remote