
Freelance Data Collection Jobs
Looking for freelance data collection jobs and project work? Browse active opportunities on PeoplePerHour, or hire data scraping specialists through Toptal’s rigorously vetted talent network.
Marketplace Data Collection
We are looking for a freelancer who can quickly collect listings from several online marketplaces.

Task:
- Find 3,000 listings on the marketplaces listed below that meet our criteria.
- Add all listings to an Excel spreadsheet.
- You must be registered on the platforms to open contact details and include them in the Excel file.

Marketplaces:
- Osta.ee
- Skelbiu.lt
- eMAG
- Bazar
- Allegro
- OLX

Requirements:
- Attention to detail
- Ability to work quickly and accurately
- Registered accounts on the platforms to access contact details

Deadline: The work must be completed within 3 days.
Payment: Negotiable.
Important: We need someone who can start working immediately.
10 days ago · 23 proposals · Remote

Design & Organization of Waste Collection Data Using Star Schema
Design and implementation of a Data Warehouse using a Star Schema to analyze waste collection operations. The project focuses on transforming raw operational data into structured dimension and fact tables to support data-driven insights and decision-making.
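A star schema of this kind pairs one central fact table with several dimension tables. A minimal sketch in Python with SQLite, purely for illustration (all table and column names are assumptions, not the client's actual schema):

```python
import sqlite3

# Two dimensions plus one fact table; names are illustrative assumptions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,  -- e.g. 20260212
    full_date TEXT, month TEXT, year INTEGER
);
CREATE TABLE dim_route (
    route_key INTEGER PRIMARY KEY,
    route_name TEXT, district TEXT, vehicle_type TEXT
);
CREATE TABLE fact_collection (
    date_key  INTEGER REFERENCES dim_date(date_key),
    route_key INTEGER REFERENCES dim_route(route_key),
    tonnes_collected REAL, stops_served INTEGER
);
""")
cur.execute("INSERT INTO dim_date VALUES (20260212, '2026-02-12', 'Feb', 2026)")
cur.execute("INSERT INTO dim_route VALUES (1, 'North Loop', 'North', 'compactor')")
cur.execute("INSERT INTO fact_collection VALUES (20260212, 1, 12.5, 340)")

# Typical analytical query: tonnes per district, joining fact to dimension.
row = cur.execute("""
    SELECT r.district, SUM(f.tonnes_collected)
    FROM fact_collection f JOIN dim_route r ON f.route_key = r.route_key
    GROUP BY r.district
""").fetchone()
```

The point of the shape is that every analytical question becomes a join from the fact table out to whichever dimensions the question slices by.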
5 days ago · 7 proposals · Remote

Online Research Assistant for Simple Data Collection
Project Description: We are looking for a freelancer to assist with a small online research project. The task involves finding publicly available information online and organizing it into a spreadsheet.

Tasks may include:
* Searching for business or website information online
* Collecting publicly available details
* Entering the information into a spreadsheet
* Organizing the data in a clear and structured format

Requirements:
* Basic internet research skills
* Familiarity with Google Sheets or Microsoft Excel
* Good attention to detail
* Ability to follow simple instructions

Budget: This is a small fixed-price project with a budget between $40 and $120, depending on experience and completion time.

Additional Information: The task is expected to take a few hours. Detailed instructions and an example spreadsheet will be provided after the project begins. Please submit a short proposal describing any experience you have with research or data entry.
11 days ago · 47 proposals · Remote

Data Engineer
We are seeking an experienced Data Engineer to help organize, clean, and structure complex real estate and regulatory compliance data across multiple sources. This role focuses on transforming inconsistent datasets related to leases, occupancy, tenants, and rent information into a reliable and scalable data foundation. The ideal candidate will review existing data, identify quality issues such as duplication and missing fields, and design standardized schemas and relationships. You will build transformation workflows to clean and normalize data from spreadsheets, databases, and system exports.

In this role, you will create master datasets for properties, units, households, leases, and compliance tracking while implementing validation rules and exception reporting. You will also document data definitions, mapping logic, and business rules to support transparency and long-term maintainability, while collaborating with stakeholders to translate operational requirements into structured data models.

Strong proficiency in SQL and Python is required, along with hands-on experience in ETL/ELT workflows and relational data modeling. Experience working with messy, Excel-heavy datasets and building data quality checks is essential, and familiarity with tools like dbt, Airflow, or cloud platforms such as Snowflake or BigQuery is highly preferred. Success in this role means delivering a clear, consistent source of truth for lease and occupancy data, reducing inconsistencies, and preparing the data environment for reporting, automation, and future product development.
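The dedup-plus-validation-plus-exception-report pattern described above can be sketched in a few lines of Python; the field names (`lease_id`, `unit`, `rent`) and the sample records are invented for illustration:

```python
# Toy validation pass over lease records; all field names and values
# are assumptions for illustration, not the client's real data model.
records = [
    {"lease_id": "L-1", "unit": "A1", "rent": 950},
    {"lease_id": "L-1", "unit": "A1", "rent": 950},   # duplicate key
    {"lease_id": "L-2", "unit": "B2", "rent": None},  # missing required field
]

def validate(rows):
    """Return (clean rows, exception report) after dedup and field checks."""
    seen, clean, exceptions = set(), [], []
    for r in rows:
        if r["lease_id"] in seen:
            exceptions.append((r["lease_id"], "duplicate"))
            continue
        seen.add(r["lease_id"])
        if r["rent"] is None:
            exceptions.append((r["lease_id"], "missing rent"))
            continue
        clean.append(r)
    return clean, exceptions

clean, exceptions = validate(records)
```

In practice the same logic would live in SQL or dbt tests rather than a script, but the contract is identical: every rejected row lands in an exception report instead of silently disappearing.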
a day ago · 11 proposals · Remote · opportunity
I need a pro to provide me with leads
I need a freelancer to collect 22 valid UK email receipts in .eml format.

Requirements:
- UK-based receipts (last 2 years)
- Correct .eml format
- Complete and accurate metadata
- Follow category guidelines (shared after hire)
- All personal data must be removed/anonymized

Important: Only accurate and verifiable receipts will be accepted.
Deadline: 24 hours after acceptance.
Before I accept the offer, you will have to provide 3 leads. Read carefully.
18 hours ago · 7 proposals · Remote
Integrate 10 APIs into 3 Different Categories on WordPress
Integrate 10 APIs across 3 different categories into my WordPress website, without the data slowing down the website or landing in the wrong category. Please confirm your price for the entire integration, and confirm whether you will use a plugin or fully custom code. Thank you for your interest in this project. I need someone who can help integrate APIs.

1. Once integrated, the APIs should pull the data and store it on my web platform so that visitors can access it easily without a fresh request every time.
2. The extracted data will be displayed via the WordPress Directorist plugin (again, let me know whether you are using code or a plugin). Importantly, each pulled record should display:
   I. The title
   II. The overview (the initial part of the job information)
   III. More information (the continuation of the extracted information)
   IV. A link to the job post to apply on
   V. The logo, if available; if not, a default from file should be loaded
   VI. Only remote jobs in some twenty-something categories should be pulled
3. When a pulled job links back to the originating website, visitors should not need to log in to view or apply, and the link should lead to that job post alone, not to a collection of other jobs.

What is your best rate for the 10 APIs? Here are some I would like you to start with:
https://rapidapi.com/Pat92/api/jobs-api14
https://rapidapi.com/fantastic-jobs-fantastic-jobs-default/api/active-jobs-db
Online learning portals, Greenhouse, and other remote job boards: please let me know which you have successfully integrated and can deliver with the Directorist plugin on a WordPress website.

Can you also work with XML feeds? If yes, please let me know your cost. If you can create a bot scraper, how much? How many days will you need to get this done? Thanks.
2 days ago · 15 proposals · Remote

Freelance Data Entry Clerk
Project Description: I am seeking a detail-oriented Freelance Data Entry Clerk. The ideal candidate will accurately input, update, and manage data across spreadsheets, databases, and CRM systems to ensure records are complete, organised, and error-free.

Key Responsibilities:
- Transferring information from physical records, PDFs, or audio files into spreadsheets, CRM systems, or databases.
- Updating existing records to ensure data is current and accurate.
- Meticulously reviewing data to identify and correct inaccuracies, missing information, or inconsistencies.
- Compiling and entering data from internal documents, reports, and provided source materials.
- Sorting and organising digital or physical documents for easy retrieval.

This is a freelance hiring opportunity, not an offer for permanent employment or outsourcing projects. If you are organised, detail-oriented, and experienced in freelance data entry, I look forward to hearing from you!
2 days ago · 39 proposals · Remote · opportunity · urgent
Reformatting and cleaning data from an old CRM
I have several Excel spreadsheets with excessive data which need cleaning and updating, then putting into a workable format.
6 days ago · 79 proposals · Remote · Expires in 24

API Data Telegram Notification Bot
Hi! I’m looking for someone to build a data collection bot via APIs. AI PROPOSALS WILL BE IGNORED.

API docs: https://docs.mobula.io/rest-api-reference/introduction

Here’s basically what it needs to do:

Step 1: Query https://api.mobula.io/docs/static/index.html#/default/post_2_pulse every 1 min. I will provide query parameters such as:

{ "views": [ { "name": "bonded true", "chainId": ["solana:solana"], "filters": { "pools": { "bonded": { "equals": true } } } } ] }

I still need to add different views, such as "volume_24h": { "gte": 10000 }. The above is an example.

Step 2: Store the data: view name, token name, contract address, price, market cap, transaction count, volume, liquidity, fees paid, bundlers, exchange, date, time (I might need a few more fields from that query, TBD). Exclude duplicates based on the contract address (NOT the token name). If there is an API or bot error, send a Telegram message via the bot.

Step 3: Every day, get OHLCV for the day two days before (from 00:00 to 23:59 UTC, for the 24 hours after the tokens were found). For example, if today is 12/02, get the data for TokenA, found at 10/02 15:18, until 11/02 15:18. This is the endpoint for that: https://api.mobula.io/docs/static/index.html#/default/get_2_token_ohlcv_history

Step 4: Calculate Max Increase, Max Decrease Before Max Increase, and Max Decrease Without Max Increase within 5M (minutes), 10M, 15M, 30M, 1H (hours), 2H, 3H, 6H, 12H, 18H, and 24H from the date and time the tokens above were first found. Use the “H” price from the above OHLCV query and the price you stored from the first query. For example, if the first price was 0.1 and the H price within 5 minutes after that was 0.12, the Max Increase should be 20(%). Note that I said “within”, not “at” 5 minutes, so any time within 5 minutes counts. If there is an API or bot error, send a Telegram message.
Step 5: Send a report via the Telegram bot in CSV format with all of the above, in the following column order: View Name, Token Name, Contract Address, Market Cap, Liquidity, Volume, Transactions, Fees Paid, Bundlers, Max Increase for each timeframe (from 5M to 24H), Max Decrease Before Max Increase for each timeframe, Max Decrease Without Max Increase for each timeframe, Date (token found), Time (token found), Data Errors (if any). Do not write $ or % in the fields, just numbers. No decimals. Dates should be formatted like 12-Feb-26. There should be a daily report sent 2 days after at 1pm UTC, and a weekly report for Monday to Sunday sent each following Tuesday at 1pm UTC. I will give you credentials for the bot via BotFather.

Requirements:
- Should work well on a Windows VPS with 2 vCPUs and 4GB RAM (if that’s not possible, please let me know before starting).
- I will give you credentials to access the VPS once the bot is ready; please install the software there.
- 95% uptime.
- I’m looking for someone to keep this up long term: $15/mo for maintaining it (same contract). If the API platform makes changes that require updating the bot, or if I want new features, these can be billed in addition.
- Deadline: 5 days from offer accepted.
- Code commented.
- Source code and changes in GitHub.
- README.md.
- Good documentation on GitHub, including installation instructions, prerequisites, changes, etc.
- The program should be easy to reboot if needed (just by opening the program).
- I should be able to change the API key easily in the code if needed.
- I should be able to whitelist my Telegram account in the code so that others can’t use it.
- Please let me know the programming languages you would use for this.

If you are not an AI, write “Letter” at the top of your proposal. Looking forward to your response :)
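The Step 4 arithmetic can be sketched as below. This is a hedged interpretation, assuming the OHLCV candles arrive as (minutes since the token was found, high, low) tuples and that "Max Decrease Without Max Increase" means the largest drop anywhere in the window; the brief would need to confirm both:

```python
def window_metrics(first_price, candles, window_minutes):
    """Percent max increase, max decrease up to and including the peak candle,
    and max decrease anywhere in the window, from the first stored price.
    `candles` is a list of (minutes_since_found, high, low) tuples (assumed shape)."""
    in_window = [c for c in candles if c[0] <= window_minutes]
    if not in_window:
        return None
    highs = [h for _, h, _ in in_window]
    lows = [l for _, _, l in in_window]
    peak_idx = max(range(len(highs)), key=highs.__getitem__)
    max_inc = round((highs[peak_idx] - first_price) / first_price * 100)
    max_dec_before = round((min(lows[: peak_idx + 1]) - first_price) / first_price * 100)
    max_dec = round((min(lows) - first_price) / first_price * 100)
    return max_inc, max_dec_before, max_dec

# The brief's own example: first price 0.1, a 0.12 high within 5 minutes -> +20%.
candles = [(3, 0.12, 0.09), (8, 0.11, 0.08)]
```

Because the spec says "within", each window takes the extreme over every candle up to that cutoff, not the value at the cutoff; rounding to whole numbers matches the "no decimals" rule in Step 5.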
4 days ago · 28 proposals · Remote

Excel Formulas for Data Cleaning and Analysis
Seeking an experienced Google Sheets/Excel specialist to craft robust formulas for automatic data extraction and analysis from an operational spreadsheet. Deliverables: formulas for date-range analysis, branch/location totals, trend and key-metric identification, concise summary outputs, and recommendations on optimal functions (FILTER, SUMIFS, QUERY, INDEX/MATCH, etc.). Data layout: branches by rows, dates by columns, numeric values at branch/date intersections. Ability to review the sheet and propose an elegant formula architecture or collaborate live is required. Describe relevant prior spreadsheet projects.
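A toy model of the stated layout (branches by rows, dates by columns) and a SUMIFS-style date-range total, written in Python for concreteness; the branch names and figures are invented:

```python
from datetime import date

# Branches down the rows, dates across the columns, numeric values at the
# intersections -- mirroring the spreadsheet layout described above.
dates = [date(2026, 2, d) for d in (1, 2, 3)]
sheet = {
    "Leeds": [120, 95, 130],
    "Derby": [80, 110, 70],
}

def branch_total(branch, start, end):
    """SUMIFS-style total for one branch over an inclusive date range."""
    return sum(v for d, v in zip(dates, sheet[branch]) if start <= d <= end)

# Key-metric identification: the branch with the highest total over all dates.
top = max(sheet, key=lambda b: branch_total(b, dates[0], dates[-1]))
```

In the actual sheet the same computation would be a single `SUMIFS` (or a `FILTER`/`QUERY` combination) with the date row as the criteria range, which is presumably what the formula architecture should deliver.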
4 days ago · 19 proposals · Remote · opportunity
UK Crypto Tax reconciliation & data analysis
Description: We are a UK-based accountancy firm specialising in Crypto tax, and we are looking for an experienced Crypto Tax Data Analyst to support ongoing client work. This role is focused heavily on data analysis rather than traditional accounting.

Scope of work includes:
- Reviewing wallet and exchange data (CSV/API exports)
- Line-by-line transaction analysis
- Identifying and categorising taxable events under UK (HMRC) rules
- Reconciling discrepancies across wallets, exchanges, and DeFi activity
- Cleaning and structuring datasets for tax reporting
- Supporting preparation of outputs for final tax review

Typical clients include:
- High-volume traders
- DeFi users (staking, liquidity pools, bridging, etc.)
- NFT traders
- Individuals with complex multi-wallet activity

Requirements:
- Strong understanding of UK Crypto tax treatment (HMRC guidance essential)
- Proven experience using tools such as Koinly, Recap, CoinTracking or similar
- Ability to handle large datasets accurately and efficiently
- Strong analytical mindset and attention to detail
- Experience identifying errors, duplicates, missing cost basis, and incorrect classifications

Nice to have:
- Experience working with UK accountancy firms
- Familiarity with DeFi protocols and on-chain activity
- Basic Excel / data manipulation skills

Engagement:
- Ongoing work available for the right candidate
- Initially project-based, with potential for long-term collaboration

Trial Task (Important): Shortlisted candidates will be asked to complete a paid trial task. This will involve reviewing a sample dataset and:
- Identifying key issues (e.g. missing cost basis, incorrect classifications, duplicates)
- Providing a brief explanation of how you would resolve them
- Demonstrating your approach to structuring clean, usable data

This is a critical part of our selection process to ensure candidates can handle real-world Crypto data complexity.

To apply, please include:
- Examples of similar Crypto tax work you’ve completed
- Which software/tools you’ve used
- Your approach to handling messy or incomplete datasets
2 days ago · 5 proposals · Remote

Data Scraping
I am seeking an adept data scraper to extract comprehensive information from a specified Shopify blog and compile it into a structured spreadsheet for seamless importation into a WordPress platform. The required data includes the original blog URL, publication date, URL of the article's image, article title, and the full content of each article. The task involves approximately 174 articles, necessitating meticulous attention to detail and accuracy. Your expertise in data extraction and formatting will be invaluable for this project. The URL is https://kingsnqueens.com/blogs/news. Thank you for your interest!
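Once the five fields have been scraped, assembling the spreadsheet is a small CSV-writing step. A sketch is below; the column headers and the sample row are assumptions, since the client's WordPress import template isn't specified:

```python
import csv
import io

# Assumed column order matching the five fields named in the brief;
# the actual WordPress import plugin may expect different headers.
FIELDS = ["blog_url", "publish_date", "image_url", "title", "content"]

def to_csv(articles):
    """Serialise a list of scraped-article dicts to CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(articles)
    return buf.getvalue()

# One invented sample record standing in for a scraped article.
sample = [{
    "blog_url": "https://kingsnqueens.com/blogs/news/example-post",
    "publish_date": "2025-01-15",
    "image_url": "https://cdn.example/img.jpg",
    "title": "Example Post",
    "content": "Full article body...",
}]
csv_text = to_csv(sample)
```

Using `csv.DictWriter` rather than string concatenation matters here because full article bodies will contain commas, quotes, and newlines that need proper escaping.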
20 days ago · 64 proposals · Remote

I need help with my Shopify store
On my Shopify store, my collections are automatically appearing as 'Featured' even though I have changed the order in which I want products to appear in each individual collection. I want to be able to choose the order in which products appear.
19 days ago · 48 proposals · Remote

UK Business Data Supplier
I am looking for a business data supplier. The data will be for independent businesses: owner's name, business name, address, email, WhatsApp, and postcode. Please quote your price per 1,000, 10,000, and 100,000 records, plus turnaround time. If you can scrape any other information for direct marketing, please let us know, including LinkedIn and plastic card companies. Regards, Proactiv
12 days ago · 25 proposals · Remote · opportunity
Set up a connection between Hoowla and Power BI
Integrate Hoowla with Power BI to enable full data extraction and reporting. Use the provided Hoowla API documentation to authenticate, retrieve all available endpoints and datasets, and implement efficient data ingestion into Power BI. Deliver a reusable, documented solution using Power Query (M), REST connectors, or a custom data connector as appropriate. Ensure incremental refresh capability, error handling, data transformation, and clear mapping of Hoowla fields to report-ready tables. Provide deployment and brief usage instructions.
4 days ago · 17 proposals · Remote

Property Research Assistant (UK Planning Data – Land & Barns)
Project Description: This role involves research only. No sales, negotiation, or brokerage activity is required. Research UK property listings to identify land, barns, or smallholdings with granted and active planning permission for holiday, glamping, or agritourism use.

Tasks:
- Search Rightmove, Zoopla, Plotfinder
- Verify planning via UK council portals
- Record accurate data in a spreadsheet

Output (per entry):
- Listing link
- Price
- Planning reference + confirmed active status
- Planning use
- Agent contact
- Brief notes

Requirements:
- UK planning portal familiarity preferred
- Only include properties with confirmed planning permission

Volume: 10–30 entries initially (ongoing possible)
Payment: Per valid entry or fixed batch (agreed)
Start: Immediate
2 days ago · 7 proposals · Remote · opportunity
Host, deploy, and test an existing app
This project aims to deploy, on a dedicated VPS server, the complete environment required to run the WHIMO solution developed by the European Forest Institute. WHIMO is an Android application designed to collect, store, and transfer geolocation data to support compliance with the EU’s EUDR regulation. To operate properly, the mobile application relies on a stable, secure, and publicly accessible backend environment.

The mission is to deploy, configure, and secure all backend services of WHIMO using the official GitHub repositories, and to ensure that the API is fully operational so the Android application can connect to it. This includes preparing the VPS, installing all dependencies (Docker, PostgreSQL, Redis, Gunicorn, HTTPS reverse proxy), setting up Firebase credentials, configuring the network and security layers, and delivering complete technical documentation. The final objective is to deliver a stable, secure, and operational hosted application on the new server environment, ready for use, while respecting best practices in development, security, and operations.

Repositories:
- Android Application: https://github.com/EuropeanForestInstitute/whimo-android
- Backend (Django + REST API): https://github.com/EuropeanForestInstitute/whimo-backend
- Infrastructure (Optional – IaC): https://github.com/EuropeanForestInstitute/whimo-infra

To be provided: VPS credentials
6 days ago · 29 proposals · Remote

Repairs and Monitoring System
I require a designer to develop a simple, user-friendly Repairs and Monitoring system to manage walkabouts, repairs and health & safety issues across five accommodation schemes. Existing Excel spreadsheets define required data; the task is to integrate these into a cohesive solution—link sheets, create dropdowns, automate data population, implement tracking and status updates, and produce clear reports. The system must accommodate staff with limited computer skills, be reliable, and easy to maintain.
2 days ago · 26 proposals · Remote

Freelance Data Entry Specialist
Project Overview: Seeking an experienced freelance specialist for a 6–8 week project to prepare Excel data and upload 700-plus product kit/variant codes into our ERP system.

Responsibilities:
• Structure and format Excel sheets for 700-plus codes, ensuring 100% accuracy.
• Bulk upload codes to the ERP and manually complete fields not supported by bulk upload (categories, warehouse, images, kit functions).
• Assign photography to items without kit codes and configure the Bill of Materials (BOM) with correct variant quantities.
• Test and validate all codes, verifying descriptions, pricing, images, and kit functionality.

Requirements:
• Strong Excel skills (data structuring, validation, formatting).
• Experience with stock databases or ERP systems (Orderwise preferred).
• High attention to detail, accuracy, and ability to work independently.

Deliverables:
• Verified, upload-ready Excel sheets.
• 700-plus completed and tested kit/variant codes with accurate pricing and imagery.

Project Details:
• Duration: 4–6 weeks
• On-site project with attendance at our Balham offices, SW12 9DJ, required every working day.
• Monday to Friday, 9am–6pm, with a one-hour lunch break
• £18 per hour
24 days ago · 25 proposals · Remote