
AI Data Services Projects
Looking for freelance AI Data Services jobs and project work? PeoplePerHour has you covered.
I need an expert in scraping with Apollo.io
Hi, I need an expert to help us scrape some data with Apollo.io. My need is to scrape and extract contacts in France, based on Human Resources functions.
2 days ago · 24 proposals · Remote
API connect for event website
Create a script for an event API service website
3 days ago · 32 proposals · Remote
Everyday Text Photography Project - US (English Language)
Experience Level: Entry
Estimated Project Duration: 1–2 weeks
We are inviting freelancers based in the US to participate in a localized photography project focused on capturing everyday text from real-world environments. The goal is to collect diverse images of printed or written text found in daily life, such as storefronts, flyers, receipts, menus, product labels, and public notices. All photos must follow simple photography and categorization guidelines that will be shared after selection. This is a flexible, remote photography task, open to multiple freelancers across the US.
Scope of Work:
- Capture clear photos of everyday items with visible text (e.g., menus, receipts, storefronts, signs, flyers)
- Take 3 photos per object: Center, Left Angle, and Right Angle
- Ensure good lighting and image variety (avoid duplicates)
- At least 75% of the text must be in English; up to 25% may be in other languages
- Organize your photos by category as per the provided instructions
- Follow the naming and documentation guidelines shared after selection
Requirements:
- Must be based in the US
- Native or fluent in English
- Smartphone or camera with good photo quality
- Ability to follow simple documentation guidelines
- Reliable internet connection for file submission
Deliverables:
- Minimum of XX image sets (each set = 3 photos of one object)
- All images must be clear, readable, and correctly categorized
- Upload completed batches to the shared workspace as instructed
Compensation:
- Payment is based on the number of approved image sets
- Only valid, non-duplicate, high-quality images will be accepted
- Payment will be released within 15–20 days after final approval
15 days ago · 2 proposals · Remote
Past Projects
Scraping contacts from O License records
I need transport and fleet manager personal emails from the internet, focusing on the O License database. The O License database provides the name and company; from there you can use Companies House records and company websites to obtain email domains, names, contact numbers, addresses, social media contacts, and personal email addresses to build an Excel sheet. Our target is fleet owners with 3 or more LCVs or commercial vehicles within London and the South East. Please send a sample of 5 rows of the Excel sheet in your response.
Data scraping for influencers (automation)
Hello, I'm looking for data on a specific set of influencers:
- Influencers with 75K+ followers on Instagram or 50K subscribers on YouTube
- With an audience aged over 40
- Speaking on topics like financial planning and many wellbeing categories
Please tell me how you would do this and how much you'll charge for the data. Can we also verify the engagement rate and verify the emails, say per 1,000 emails?
Proposed approach: a Python script using the YouTube Data API v3 (free, official) and Phantombuster (Instagram) to automatically search, filter, and export influencer data to a Google Sheet or CSV.
YouTube (API — free and clean)
Use: YouTube Data API v3
Script logic:
1. Loop through the keyword list
2. Search type=channel for each keyword
3. Pull channel details: name, URL, subscriber count, description, email (regex-parsed from the description)
4. Filter: subscribers between 50,000 and 500,000
5. Export all results to CSV
Keywords to loop: retirement planning lifestyle, retirement vlog over 60, grandparenting tips grandma, empty nest life over 50, pickleball over 50, senior fitness over 60, menopause health women over 50, midlife women wellness, caregiver aging parents, RV retirement full time, gardening over 50 vegetable, men's health over 50, medicare senior health, Christian women over 50 faith, retirement lifestyle active aging senior
Output columns: Channel Name | URL | Subscribers | Email | Description snippet | Keyword searched
Instagram (Phantombuster)
Use: Phantombuster Instagram Hashtag Scraper + Profile Scraper (client will provide API key and session cookie)
Script/phantom logic:
1. Run the Instagram Hashtag Post Scraper on each hashtag below
2. Pull 500 posts per hashtag → extract unique profile URLs
3. Feed profile URLs into the Instagram Profile Scraper
4. Extract: username, follower count, bio text, email from bio, website URL
5. Filter: followers 50,000–500,000
6. Export to CSV
Hashtags to scrape: #retirementplanning #retirementlife #retireearly #grandparents #grandmalife #nanalife #grandpalife #emptynesters #lifeafterkids #over50life #pickleballlife #activeaging #seniorfitness #womenover50 #menopause #midlifewomen #caregiverlife #agingparents #seniorcare #rvlife #fulltimerv #retirementtravel #vegetablegarden #gardeningover50 #growyourown #menshealth #healthyaging #over50fitness #medicare #seniorliving #agingwell #christianwomen #faithoverfear #christianliving
Output format: single CSV, one row per creator:
Platform | Handle/Channel | URL | Followers | Email | Bio/Description | Category | Source Keyword
Filtering rules (build into script):
∙ Followers: 50,000–500,000 only
∙ Skip accounts with 0 posts or last post over 60 days ago
∙ Skip accounts where the bio contains: “under 18”, “teen”, “student”, “college”
∙ Flag rows with no email (for an Apollo enrichment pass)
∙ Deduplicate on URL before export
Deliverable
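The YouTube half of that plan can be sketched in Python. The channel records here are assumed to have already been fetched via the YouTube Data API v3 (`search.list` with `type=channel`, then `channels.list` for subscriber statistics); this sketch covers only the filtering, email-parsing, and CSV-export steps, and all function names are illustrative rather than part of the brief.

```python
import csv
import re
from typing import Optional

# Simple, permissive email pattern for regex-parsing channel descriptions.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def extract_email(description: str) -> Optional[str]:
    """Return the first email address found in a channel description, if any."""
    m = EMAIL_RE.search(description)
    return m.group(0) if m else None

def keep_channel(subscribers: int, lo: int = 50_000, hi: int = 500_000) -> bool:
    """Apply the brief's subscriber-count filter (50,000-500,000)."""
    return lo <= subscribers <= hi

def to_row(channel: dict, keyword: str) -> Optional[list]:
    """Map one fetched channel record to the output columns, or None if filtered."""
    if not keep_channel(channel["subscribers"]):
        return None
    desc = channel.get("description", "")
    return [
        channel["name"],
        channel["url"],
        channel["subscribers"],
        extract_email(desc) or "",
        desc[:80],          # description snippet column
        keyword,            # which search keyword surfaced this channel
    ]

def export_csv(path: str, rows: list) -> None:
    """Write filtered rows under the output-column header from the brief."""
    header = ["Channel Name", "URL", "Subscribers", "Email",
              "Description snippet", "Keyword searched"]
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)
```

In a full script, `to_row` would run inside the keyword loop, collecting non-None rows across all keywords before a single `export_csv` call.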
AI receptionist connected to Google Sheets
AI Phone System – One-Page Summary
Objective: reduce call volume, automate routine enquiries, improve response times, and ensure no missed customer enquiries. We are a company that offers activity training and experience vouchers.
System overview: customer calls → AI receptionist → AI identifies call type → transfers to staff OR handles automatically (voucher, weather, staff availability) → logs enquiry → notifies team.
Call handling:
- Learning enquiries: all phones
- Existing customers: reception + operations
- Training queries: reception + training + operations + management
- Accounts: operations
- Voucher calls: AI handles
- Flight status: AI checks dashboard
Required software:
- Vonage – phone system (existing)
- Synthflow – AI receptionist
- Twilio – SMS messaging
- Make.com – automation
- Google Sheets – operations dashboard
- Google Forms – booking form
Operations dashboard: Flights Today | Staff Status | Instructor Status | AI Controls | Enquiries Log
Automation features:
- Voucher booking automation
- Flight status responses (GO / DELAY / CANCEL / CHECK LATER)
- Staff availability responses
- Missed call capture
Implementation steps:
1. Create dashboard
2. Set up AI system
3. Add SMS integration
4. Create booking form
5. Connect automation
6. Test system
7. Forward main number to AI
8. Install reception dashboard
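The flight-status branch of a system like this reduces to a small lookup. As a minimal sketch: in the live setup, Make.com would read the status cell from the Google Sheets dashboard and Synthflow/Twilio would deliver the reply; the reply wording below is an assumption, not the client's copy.

```python
# Hypothetical status-to-reply mapping for the "Flight status" call branch.
# The dashboard cell values (GO / DELAY / CANCEL / CHECK LATER) come from
# the posting; the reply sentences are illustrative placeholders.
REPLIES = {
    "GO": "Your flight is going ahead as scheduled.",
    "DELAY": "Your flight is delayed; we'll text you an updated time.",
    "CANCEL": "Today's flights are cancelled; we'll help you rebook.",
    "CHECK LATER": "No decision yet, please check back in an hour.",
}

def flight_status_reply(dashboard_value: str) -> str:
    """Normalise the dashboard cell and pick the matching caller message.

    Unknown values fall through to a human handoff rather than guessing,
    matching the brief's 'transfers to staff' default.
    """
    key = dashboard_value.strip().upper()
    return REPLIES.get(key, "Let me transfer you to a member of staff.")
```

Keeping the mapping in data (or in the sheet itself) means staff can edit replies without touching the automation.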
Scraping of Website database information
Require a straightforward web-scraping project to extract publicly available database records from a website. Gather complete contact details including names, email addresses, phone numbers and physical addresses. Deliver clean, deduplicated structured data (CSV or JSON) with clear field mapping. Ensure respectful, lawful scraping practices, include brief documentation of methods, sample code and any dependencies. Bid with estimated timeline and fixed price.
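The deliverable described here, clean deduplicated contact data in CSV with clear field mapping, could be sketched as below. The field names and the choice of a normalised email as the dedup key are assumptions for illustration, not part of the client's spec.

```python
import csv
import io

# Assumed contact-record schema; the brief only lists the field types.
FIELDS = ["name", "email", "phone", "address"]

def deduplicate(records: list) -> list:
    """Drop duplicate contacts, keyed on a normalised (case/space) email,
    usually the most reliable unique field in scraped contact data."""
    seen = set()
    out = []
    for rec in records:
        key = rec.get("email", "").strip().lower()
        if key and key in seen:
            continue
        seen.add(key)
        out.append(rec)
    return out

def to_csv(records: list) -> str:
    """Serialise deduplicated records with an explicit header row,
    giving the 'clear field mapping' the brief asks for."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(deduplicate(records))
    return buf.getvalue()
```

The same records could be dumped as JSON with `json.dumps` if the client prefers that format.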
Land use land cover map and morphological parameters map, India
Land use land cover maps from 2006 to 2024 in ArcGIS. Also prepare morphological parameter maps (20 in number) for a region in India: parameters such as slope, height above nearest drainage, drainage density, TWI, soil, distance to river, distance to road, CN grid, etc. The land use land cover maps cover the period 2006 to 2024. Fee is 120 US dollars; time: 10 days.
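Of the parameters listed, TWI has a standard closed form, TWI = ln(a / tan β), where a is the specific catchment area and β the local slope. In ArcGIS this is a raster-calculator expression over flow-accumulation and slope rasters; the sketch below shows the per-cell arithmetic, with the assumption that slope arrives in degrees and flat cells are floored to avoid division by zero.

```python
import math

def twi(specific_catchment_area: float, slope_deg: float) -> float:
    """Topographic Wetness Index for one cell: ln(a / tan(beta)).

    specific_catchment_area -- upslope area per unit contour length
                               (m^2/m), typically flow accumulation
                               multiplied by cell size
    slope_deg               -- local slope in degrees; floored at 0.01
                               degrees so flat cells don't divide by zero
    """
    beta = math.radians(max(slope_deg, 0.01))
    return math.log(specific_catchment_area / math.tan(beta))
```

Larger contributing areas and gentler slopes both raise the index, which is why TWI highlights valley bottoms and drainage convergence zones.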
UK Business Data Supplier
I am looking for a business data supplier. The data will be independent businesses: owner's name, business name, address, email, WhatsApp, postcode. Please quote a price per 1,000, 10,000 and 100,000 records, plus turnaround time. If you can scrape any other information for direct marketing, please let us know, including LinkedIn & plastic card companies. Regards, Proactiv
Need help analysing a data set and creating results in a dashboard
We are seeking an adept data analyst to meticulously evaluate a dataset in Excel and craft a dynamic dashboard that vividly presents key performance indicators (KPIs) using Power BI. The dashboard should be designed to seamlessly incorporate additional data while ensuring optimal functionality. The project is expected to take approximately three hours for a qualified expert. A collaborative session via Zoom or a similar platform will be essential to discuss project specifics and expectations. Proficiency in Excel and dashboard creation is imperative for this endeavor.
Need to analyse data in Excel and create a dashboard
We seek a proficient data analyst to evaluate data in Excel and develop a dynamic dashboard that effectively showcases key performance indicators (KPIs). The dashboard must be designed for easy integration of additional data while maintaining functionality. The anticipated completion time for this project is approximately three hours for a skilled professional. A collaborative session via Zoom or a similar platform will be required to discuss project specifics and expectations. Expertise in Excel and dashboard creation is essential.
I need to obtain LinkedIn profiles from company data
I am looking for someone to provide the LinkedIn profile URLs for a list of data I can supply either daily or weekly, developed as a logic-driven engine rather than a basic scraper. The system would match first name, last name, and registered location, then apply probability modelling using year of birth, likely education timeline, role alignment, and career consistency. Instead of returning raw results, the engine would score potential matches by confidence percentage, providing ranked outputs suitable for campaign targeting.
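The probability-modelling idea could be prototyped roughly as follows. The weights, field names, and the 21–25-year birth-to-graduation window are placeholder assumptions for illustration, not the client's scoring rules.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """One potential LinkedIn profile match (fields are illustrative)."""
    first_name: str
    last_name: str
    location: str
    est_grad_year: int  # graduation year estimated from the profile's education section

def confidence(record: dict, cand: Candidate) -> float:
    """Score evidence that a profile matches a supplied record, 0.0-1.0.

    Weighted, additive model: exact name matches, location containment,
    and a plausible birth-to-graduation gap each contribute evidence.
    """
    score = 0.0
    if record["first_name"].lower() == cand.first_name.lower():
        score += 0.30
    if record["last_name"].lower() == cand.last_name.lower():
        score += 0.30
    if record["location"].lower() in cand.location.lower():
        score += 0.20
    # Education-timeline check: graduation typically falls 21-25 years
    # after birth (an assumed window, tunable per market).
    gap = cand.est_grad_year - record["year_of_birth"]
    if 21 <= gap <= 25:
        score += 0.20
    return round(score, 2)

def ranked(record: dict, candidates: list) -> list:
    """Return (confidence, candidate) pairs sorted best-first."""
    return sorted(((confidence(record, c), c) for c in candidates),
                  key=lambda pair: -pair[0])
```

A production version would add fuzzy name matching and role-alignment signals, but the ranked-confidence output shape stays the same.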
UK Part-time Recruitment | WiFi Field Data Collector
UK Part-time Recruitment | WiFi Field Data Collector, earn extra money easily!
Core highlights:
• Transparent salary: £3 per asset; more work, more gain; no hidden deductions
• Flexible time: no office hours required; take orders and arrange your time freely in spare moments
• Ultra-low threshold: no professional experience needed; easy to master for beginners, with a bilingual (Chinese & English) operation manual
• UK-wide coverage: no specific city restrictions; available anywhere in the UK
Must-meet requirements (all required):
1. Location: must be located in the UK and able to complete on-site data collection tasks locally
2. Device:
• Android smartphone (Android 6.0 or above; Android 9.0 is not recommended as it tends to freeze)
• Minimum memory: 8 GB RAM + 8 GB storage
• iOS devices are not supported for now; please do not apply
Key responsibilities:
1. Use the designated app (WiFi Field Collection APP V1.2) to complete WiFi signal collection at designated locations in accordance with operational standards
2. Complete basic app operations: take photos, confirm location, submit data (a detailed bilingual operation manual is provided)
3. Maintain a stable mobile network and normal positioning during collection to ensure data accuracy
Part-time perks:
• Salary guarantee: transparent and traceable settlement process, no reduction of rights and interests
• Easy to get started: step-by-step manual guidance, no skill threshold, accessible to everyone
• High flexibility: no geographical restrictions across the UK; adjust collection locations and times at any time
• Standardized process: fully transparent task allocation, operational standards and data review
Notes:
1. Maintain a stable mobile network throughout the work to ensure the app runs normally and data is synchronized
2. Authorize the app to access location permissions to assist in accurately locating collection points
3. A complete bilingual (Chinese & English) operation manual will be provided after onboarding, covering the entire process from login to submission
Application method: please send the following information to the designated application channel: [Your Full Name + UK City of Residence + Mobile Phone Model & Android System Version]
Review timeframe: qualification verification will be completed within 24 hours (focusing on the must-meet requirements)
Follow-up communication: after passing the review, we will inform you of task details, operational standards and salary settlement rules as soon as possible
We sincerely invite friends in the UK who meet the device requirements to join us and earn extra money easily in your spare time!
Build Make Automation to Source and Shortlist Candidates
Project Title: Build Make Automation to Source and Shortlist Candidates for a Recruitment Agency
Project overview: I run a UK recruitment agency and require a compliant automated candidate sourcing and shortlisting system built using Make (formerly Integromat). The automation must run daily and work across all live job roles listed on our careers page. The objective is to surface high-quality, relevant candidates without hours of manual searching. This is not a scraping project. Any solution that bypasses platform restrictions or login walls will not be accepted.
Objectives:
- Reduce manual candidate sourcing time
- Automatically find potential candidates from public sources
- Feed sourced candidates into an existing CV scanner and scoring system
- Produce a daily shortlist per job role
Core requirements:
- Built entirely using Make
- Runs daily on a schedule
- Automatically pulls all live job roles from our website
- Loops through each job role individually
- Generates role-specific search queries
- Uses search APIs only, for example Google Custom Search API or SerpAPI
- Explicitly excludes LinkedIn, Indeed, Reed, CV Library, or any login-protected platforms
- Extracts public profile text or CV links
- Pushes candidate data into an existing CV analysis and scoring pipeline
- Automatically shortlists, reviews, or rejects candidates based on defined rules
Role-based configuration. The system must allow configuration without code changes for each job role:
- Mandatory keywords
- Desirable keywords
- Location terms or regions
- Excluded keywords
- Score threshold for shortlisting
- Maximum candidates per role per day
Output and storage:
- Store candidates in Google Sheets, Airtable, or an existing database
- Required fields: Role ID; candidate name, where available; source URL; extracted text; score; missing mandatory requirements; status (New, Shortlisted, Reviewed, Rejected); date sourced
- Send one daily email summary showing top candidates per role
Compliance requirements:
- No scraping of LinkedIn, Indeed, or job boards
- No browser automation or login bypassing
- Public data only
- Respect robots.txt where applicable
Nice to have:
- Modular Make scenarios, not one long chain
- Clear documentation explaining how to add or remove roles
- Logging and error handling
- Duplicate prevention logic
Deliverables:
- Fully working Make automation
- Role configuration sheet
- Storage setup
- Daily shortlist email
- Basic documentation
Application requirements. When applying, please include:
- Examples of Make automations you have built
- Confirmation you understand and will comply with the sourcing restrictions
- Estimated timeline and fixed price
- Any clarification questions after reading this brief
https://www.h2orecruitmentservices.co.uk/careers
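The "role-specific search queries" step could be prototyped as below before being wired into a Make scenario (where the same logic would live in an iterator feeding the search-API HTTP module). The config field names are illustrative; the real values would come from the no-code role configuration sheet.

```python
def build_queries(role: dict) -> list:
    """Turn one role's configuration row into search-API query strings.

    Produces one query per configured location. Mandatory keywords are
    quoted for exact matching; excluded keywords use minus-prefixed
    quoted terms, a syntax both Google Custom Search and SerpAPI accept.
    Field names ('mandatory_keywords', etc.) are assumed, not the brief's.
    """
    base = " ".join(f'"{kw}"' for kw in role["mandatory_keywords"])
    exclusions = " ".join(f'-"{kw}"' for kw in role.get("excluded_keywords", []))
    queries = []
    for loc in role.get("locations", [""]):
        # Skip empty parts so queries stay clean when a field is unset.
        q = " ".join(part for part in (base, loc, exclusions) if part)
        queries.append(q.strip())
    return queries
```

Because the queries derive entirely from the config row, adding or editing a role never requires a scenario change, which is the compliance-friendly "configuration without code changes" requirement.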
We are preparing to build a Data Center Construction Intelligence Platform. Before development begins, we need an initial, authoritative dataset covering previously built and currently active data center construction projects in the U.S. This dataset will be the foundation of a platform that is designed to update monthly over time. This role is research and data structuring only — no application development, no scraping tools, and no automation required.
Data sources (representative, not exhaustive). Research should be based on reputable industry sources, such as:
- BuilderConnected
- Dodge Construction Network
- CRANE Construction Intelligence
- Public owner / developer disclosures
- Industry reports and trade publications
- Publicly available permitting and project announcements
⚠️ Important: you are not required to have paid access to all platforms. We are looking for structured summaries and metadata, not proprietary dumps.
Scope of work
1️⃣ Data center project inventory. Compile structured information on previously built data centers and currently active / under-construction data centers. Where available, capture:
- Location (city, state)
- Project status (completed / active)
- Facility type (enterprise, hyperscale, colocation, edge, etc.)
- Approximate scale (high-level)
- General delivery model (design-build, CM, etc.)
2️⃣ Construction systems & assemblies (project-informed). Using real-world project data, organize:
- Structural systems
- Electrical & power infrastructure
- Mechanical / cooling strategies
- Fire protection approaches
- Low-voltage & networking considerations
Focus on patterns and common approaches, not engineering drawings.
3️⃣ Tier-level & redundancy context (high level). Where identifiable, note:
- Tier I–IV alignment (if stated or implied)
- Redundancy concepts (N, N+1, 2N, etc.)
- How redundancy impacts construction scope
This should remain descriptive, not technical design.
4️⃣ Cost & schedule drivers (qualitative). Based on project patterns, identify:
- Primary cost drivers
- Schedule risk factors
- Regional labor/material sensitivity
- Supply chain and lead-time influences
No exact pricing required.
5️⃣ Structured, app-ready data format (critical). All information must be delivered in structured spreadsheet format, with clearly defined columns such as: project_name, location, project_status, facility_type, system_category, assembly_or_component, tier_or_redundancy_level, primary_cost_drivers, schedule_risk_factors, data_source, notes. This dataset will later feed an application and AI system.
Deliverables. You must provide:
- Google Sheets or Excel file(s)
- Clean, consistent structure and naming
- CSV export
- A short summary document explaining: data sources used; assumptions and limitations; recommendations for monthly updates
No PDFs. No slide decks. No scraped raw dumps.
Ongoing monthly updates (future work). After the initial dataset is delivered, we plan to update the dataset monthly, add newly announced or completed projects, and refine patterns as the dataset grows. Please indicate in your proposal whether you are open to a monthly update engagement and your estimated monthly fee range for ongoing updates.
Ideal candidate:
- Experience with construction, infrastructure, or industrial research
- Familiarity with construction intelligence platforms or trade data
- Strong data organization skills
- Methodical, detail-oriented approach
- Comfortable citing and tracking data sources
Screening question (required): how would you gather and structure authoritative construction project data so it can be updated monthly and later used in an application or AI system? Generic responses will not be considered.
Selection criteria. We will prioritize candidates who:
- Demonstrate structured thinking
- Understand construction project data
- Respect data-source boundaries and licensing
- Clearly explain how they will organize and maintain the dataset
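A deliverable like this stays consistent across monthly updates if each CSV export is run through a small schema check. The sketch below validates an export against the column list in the brief; the allowed status vocabulary and the specific checks are assumptions for illustration.

```python
import csv
import io

# Column schema taken from the brief's "app-ready data format" section.
COLUMNS = [
    "project_name", "location", "project_status", "facility_type",
    "system_category", "assembly_or_component", "tier_or_redundancy_level",
    "primary_cost_drivers", "schedule_risk_factors", "data_source", "notes",
]
# Assumed controlled vocabulary matching "completed / active" in the brief.
ALLOWED_STATUS = {"completed", "active"}

def validate_rows(csv_text: str) -> list:
    """Return a list of human-readable problems found in a CSV export.

    An empty list means the export matches the schema: correct header,
    valid project_status values, and a cited data_source on every row.
    """
    problems = []
    reader = csv.DictReader(io.StringIO(csv_text))
    if reader.fieldnames != COLUMNS:
        problems.append("header does not match schema")
    for i, row in enumerate(reader, start=2):  # row 1 is the header
        if row.get("project_status") not in ALLOWED_STATUS:
            problems.append(f"row {i}: bad project_status {row.get('project_status')!r}")
        if not row.get("data_source"):
            problems.append(f"row {i}: missing data_source")
    return problems
```

Running the same validator before every monthly delivery keeps new rows structurally identical to the initial dataset, which matters once an application or AI system starts ingesting it.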
Role-Play Bank Call Audio Recording Project (Malay–English)
We are looking for native Malay speakers based in Malaysia to participate in a role-play audio data collection project. The task involves simulating bank call-center conversations using provided scripts. This is strictly a role-play project — no real banks, customers, or sensitive information are involved. Audio recordings are captured automatically through the client's IVR system. Participants only need to place calls and follow the instructions.
Language requirement:
- Malay (local Malay mixed with English)
- Native fluency is required
What you will do:
- Review a short role-play script provided by us
- Adapt the script naturally to sound local and conversational
- Place a call to a designated number
- Enter a unique IVR ID when prompted
- Perform one role per call (Agent or Customer)
- Speak naturally, as in a real call-center interaction
- Pause briefly after completing the dialogue and end the call
✅ No manual recording ✅ No audio uploads ✅ All audio is captured automatically
Requirements:
- Native Malay speaker currently residing in Malaysia
- Age range: 25–60 years
- Clear pronunciation and natural speaking tone
- Ability to role-play realistic call-center conversations
- Comfortable adjusting AI-generated scripts to sound natural
- Quiet environment with minimal background noise
- Headphones are not permitted
- Maximum participation time: 40 minutes per speaker
Project details:
- Project type: one-time project
- Work type: remote
- Experience level: entry level
- Payment: fixed price (per participant)
- Potential: future similar tasks may be available based on performance
Skills (select on PeoplePerHour): Voice Recording, Voice Acting / Voice Over, Audio Data Collection, Communication Skills, Data Entry (basic)
Single airline Flight Data web scrape
I need someone to provide me with a list of all Emirates flights on 21st November 2025. The list must include point of departure, destination, flight time, aircraft type and callsign. AI tells me that Emirates operates around 500 individual flights per day. The data can be obtained from Flightradar24, for which I can provide a temporary Gold subscription login. I have provided a blank copy of the Excel spreadsheet that needs completing with the data.
Easy Online AI Tasks
We're seeking individuals to assist in training and enhancing an AI system. The work is simple, online, and flexible. It's a good fit if you enjoy chatting, answering questions, or doing small creative tasks.
What you'll actually be doing:
- Chatting with an AI and giving feedback
- Answering questions to help the AI learn
- Editing or reviewing simple text and images
- Checking if AI responses make sense and fixing small mistakes
No advanced tech skills needed — if you can read, write, and pay attention, you can do this.
The project lasts 1–3 months (with possible extension).
Must be based in Tier 1 countries (US, Canada, UK, Australia preferred — also open to Europe & the Americas).
Research Study: Booking Communication Samples (Vietnam & Norway)
This research project studies authentic booking communication patterns in travel/hospitality, etc. Vietnam/Norway-based freelancers can submit EML files of real booking confirmations in Vietnamese or Norwegian. Our team handles full anonymization—no personal data needed or stored.
EPR/WEEE Data extraction
I require quarterly data extraction and consolidation for WEEE data and submission, and data extraction and consolidation every 6 months for EPR. You will be extracting data from spreadsheets and reports and consolidating it into returns. The data is small and easy to extract into the final forms, which produce the numbers I need to enter into a web-based system. The rate is £30 per quarterly WEEE return (x4 per year), and £50 for the EPR extraction, which is every 6 months (x2 per year). Happy to provide a few samples of the data to be used for extraction and the forms the data is to be placed in. Payment will be invoiced after each return is completed.