
Data Science & Analysis Projects
Looking for freelance Data Science & Analysis jobs and project work? PeoplePerHour has you covered.
Featured · Opportunity · Urgent
Animated Mystery Box Reveal Game for Live Stream (HTML/JS/CSS)
I am looking for a developer to build a standalone animated “mystery box” reveal game that will be used on Facebook Live. This is NOT a customer-facing website. No payments, logins, or user accounts are required. The tool will be used by me only and shown live by filming my laptop screen with a phone during Facebook Live.

The game should display 100 mystery boxes on screen. Each box contains a pre-defined prize that I will supply via a CSV or JSON file. During a live, when a winner finds a mystery box, I will click the relevant box number and:
- The box animates (shake / unwrap / open)
- The prize is revealed with a fun, high-quality animation
- The box becomes locked and cannot be opened again

The visual style must be bold, exciting, and easy to read when filmed on a phone camera. Animations should be smooth and dramatic (casino / instant win / mystery box style), not basic pop-ups.

Technical requirements:
- HTML / CSS / JavaScript
- Suitable for use in a normal web browser
- No backend complexity required beyond loading the prize data
- I must receive and own the full source code

Nice to have:
- Sound effect on reveal
- Confetti or glow animation
- Simple reset option for new competitions

Please include examples of interactive or animated web projects and your estimated cost and delivery time.
Power BI for real estate analysis
We seek an experienced freelancer to develop a sophisticated Power BI system tailored for real estate analysis. The project involves migrating existing Excel functionality to Power BI, enabling seamless data input and comprehensive analysis of key metrics (including rating units/schemes and proximity assessments based on postal codes). A well-structured master document for new deals will also be required. Familiarity with the UK real estate market and private equity financial requirements is essential to ensure high-caliber outputs for our client. Your expertise will play a crucial role in delivering an exceptional analytical tool.
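For the postcode proximity element, the underlying calculation is typically a great-circle distance between geocoded postcode centroids. Below is a minimal Python sketch of that logic purely for illustration; in the actual deliverable it would live in Power BI as a DAX measure or Power Query step, and the postcode coordinates shown are hypothetical examples.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/long points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # Earth radius approximated as 6371 km

# Hypothetical postcode centroids: postcode -> (lat, long)
centroids = {
    "EC1A 1BB": (51.5201, -0.0979),
    "M1 1AE": (53.4780, -2.2340),
}

print(round(haversine_km(*centroids["EC1A 1BB"], *centroids["M1 1AE"]), 1), "km")
```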
an hour ago · 22 proposals · Remote · Opportunity
Data Analyst
Position Overview: We are seeking a Data Analyst to transform raw data into actionable insights. In this role, you will analyze complex datasets, create reports, and develop visualizations to support business decisions and optimize processes.

Key Responsibilities:
- Collect, clean, and organize data from multiple sources.
- Analyze datasets to identify trends and insights.
- Create reports and dashboards using BI tools (Tableau, Power BI, etc.).
- Design visualizations to communicate findings clearly.
- Collaborate with teams to provide data-driven recommendations.
- Ensure data integrity through audits and validation checks.

Required Skills and Experience:
- Education: Bachelor’s degree in Data Science, Statistics, or a related field.
- Experience: 1–3 years in data analysis or a related field.
- Technical Skills: Proficiency in SQL and data analysis tools (Excel, Python, R, etc.).
- BI Tools: Experience with data visualization tools (Tableau, Power BI).
- Analytical Skills: Strong ability to interpret data and provide actionable insights.
- Communication: Ability to present complex data simply and effectively.

Nice to Have:
- Experience with machine learning or predictive analytics.
- Familiarity with data warehousing and ETL processes.
18 hours ago · 17 proposals · Remote
Data classification
I will share a purely categorical dataset and need it turned into a clear, well-documented end-to-end classification workflow that I can study for academic purposes. Using Python with Pandas, NumPy, scikit-learn, and visualisations in Matplotlib or Seaborn, start with an exploratory review, handle all cleaning and preprocessing (encoding, missing values, feature selection), then build and compare suitable classification models. Sound evaluation (accuracy, precision, recall, F1, or any metric you judge relevant) must accompany the models, followed by a concise discussion of the results and why a particular approach performs best. Please highlight your experience with similar projects when you respond; I value demonstrated know-how over long proposals.

Deliverables I expect:
• A well-commented Jupyter notebook covering EDA, preprocessing, model training, and evaluation
• The cleaned dataset (or the code that generates it)
• A brief markdown or slide deck that walks through the methodology, findings, and recommended next steps

Clarity of explanation is just as important as model accuracy, as the primary goal is learning from your workflow.
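As a rough illustration of the kind of workflow being requested, here is a minimal scikit-learn sketch for an all-categorical dataset; the file name, target column, and model choices are placeholders, and the full notebook would add EDA, missing-value handling, and feature selection.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

df = pd.read_csv("categorical_data.csv")          # hypothetical file
X, y = df.drop(columns=["target"]), df["target"]  # hypothetical target column

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# One-hot encode every feature column (all features are categorical here)
preprocess = ColumnTransformer(
    [("cat", OneHotEncoder(handle_unknown="ignore"), list(X.columns))]
)

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=300, random_state=42),
}

# Fit each candidate model in a pipeline and compare standard metrics
for name, clf in models.items():
    pipe = Pipeline([("prep", preprocess), ("model", clf)])
    pipe.fit(X_train, y_train)
    print(f"=== {name} ===")
    print(classification_report(y_test, pipe.predict(X_test)))
```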
20 hours ago · 8 proposals · Remote
Need help with Looker Dashboard in Looker Studio
I have a data file for which I need help building dashboards in Looker Studio. The data relates to phishing.
10 days ago · 13 proposals · Remote
Live Score Reporter (In-Stadium)
About Us
We are a new sports technology startup focused on making live football match updates faster and more reliable than ever before. Our mission is to reduce delays in live match information by sourcing updates directly from the stadium, ensuring that goals, key events, and match progress are shared as they happen, not seconds later. Currently, we are in the testing and evaluation phase, exploring how real-time, in-person reporting can improve the speed and accuracy of live football updates. This stage allows us to validate our concept, refine our processes, and measure the true impact of faster live data delivery. We believe that speed matters in live football. By combining human presence at matches with efficient communication systems, we aim to build a solution that sets a new standard for real-time match updates. As we grow, our focus remains on accuracy, transparency, and continuous improvement as we work toward creating a dependable live football update platform.

Job Overview
We are seeking reliable live update reporters who attend matches in person and can send real-time score updates instantly from inside the stadium. Speed, accuracy, and consistency are critical.

Key Responsibilities
- Attend matches physically at the stadium
- Send live score updates immediately (goals, halftime, full time, red cards, penalties, etc.)
- Ensure 100% accuracy and minimal delay
- Stay focused on the game throughout the match
- Follow reporting instructions and timing rules strictly

Required Skills & Qualifications
- Ability to attend stadium matches regularly
- Fast and reliable mobile texting / messaging skills
- Strong internet connection (mobile data backup preferred)
- Excellent attention to detail
- Punctual, trustworthy, and responsive
- Basic understanding of football (or the specific sport involved)

Preferred (Nice to Have)
- Prior experience in live sports reporting
- Access to multiple stadiums or leagues
- Dual SIM phone or backup device
- Familiarity with WhatsApp, Telegram, or a custom reporting app

Payment Structure
As we are a new startup currently in the testing and evaluation phase, compensation is structured on a per-match basis.
- Base Payment: €25 per completed match
- Performance Bonuses: Additional bonuses may be awarded based on the speed and accuracy of live updates
- Payments are issued after successful completion and review of each match
- Consistent performance during the testing phase may lead to increased rates and long-term opportunities

This structure allows us to fairly compensate contributors while assessing the effectiveness and scalability of our live reporting model.
13 days ago · 1 proposal · Remote
Senior Data Engineer
We are seeking a Senior Data Engineer to design, implement, and optimize data pipelines utilizing Scala, Spark, and Java. The ideal candidate will develop and maintain real-time data processing systems essential for business operations. Collaboration with data scientists and analysts is crucial to understand data requirements and deliver high-quality solutions. Responsibilities include ensuring data quality through robust testing, monitoring workflows, and troubleshooting pipelines. Candidates should possess a degree in Computer Science or Engineering, with proven experience in data engineering, real-time processing, and SQL proficiency. Familiarity with cloud platforms and data governance is preferred. We offer a competitive salary, benefits, and opportunities for professional growth in a collaborative environment.

Key Responsibilities:
- Design, implement, and optimize data pipelines using Scala, Spark, and Java.
- Develop and maintain real-time data processing systems to support business-critical operations.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality solutions.
- Ensure data quality and reliability through robust testing and validation procedures.
- Monitor and troubleshoot data pipelines and workflows to ensure high availability and performance.
- Stay current with emerging technologies and industry best practices to continuously improve our data infrastructure.

Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Proven experience with Scala, Spark, and Java in a data engineering or similar role.
- Strong understanding of real-time data processing and streaming technologies.
- Experience with big data platforms and tools such as Hadoop, Kafka, and Flink is a plus.
- Proficiency in SQL and experience with relational databases.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Skills:
- Experience with cloud platforms (AWS, Azure, Google Cloud) and their data services.
- Knowledge of data warehousing solutions and ETL processes.
- Familiarity with data governance and security best practices.
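For context on the kind of real-time pipeline described above, here is a minimal Spark Structured Streaming sketch that reads events from Kafka. It is written in PySpark purely for brevity (the role itself calls for Scala and Java), and the broker address, topic name, and event schema are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructType, TimestampType

spark = SparkSession.builder.appName("realtime-events").getOrCreate()

# Hypothetical event schema
schema = (
    StructType()
    .add("event_id", StringType())
    .add("amount", DoubleType())
    .add("event_time", TimestampType())
)

# Read a Kafka topic as a streaming DataFrame (broker and topic are placeholders)
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers raw bytes; cast the value to string and parse the JSON payload
events = raw.select(
    from_json(col("value").cast("string"), schema).alias("e")
).select("e.*")

# Write parsed events to the console for inspection; a production pipeline would
# write to a durable sink (Delta, Kafka, warehouse) with checkpointing enabled
query = events.writeStream.outputMode("append").format("console").start()
query.awaitTermination()
```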
14 days ago · 18 proposals · Remote
Update 2 Pivot tables on a demo dashboard
Someone misanalysed the data fields they were asked to pick up in an Excel pivot table. This should be a quick fix for someone who uses pivot tables in their everyday work and projects. Please indicate whether you can work on this ASAP, as it is important.
18 days ago · 21 proposals · Remote
Build Automated Stock Data Pipeline (API + Sheets)
I need a solution to automate stock market data using Finnhub API. The project involves fetching real-time or scheduled stock price data through API integration, processing it using an automation workflow (such as n8n), and storing the results in Google Sheets or Excel. The workflow should be reliable, easy to maintain, and scalable for multiple stock symbols. Clean data formatting and basic validation are required. This project is suitable for a developer with experience in API integration, automation workflows, and data handling.
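As a rough sketch of the core fetch-and-store step (outside a workflow tool such as n8n), the following Python snippet pulls quotes from the Finnhub REST API and appends them to a Google Sheet via gspread. The API key, credentials file, spreadsheet name, and symbol list are placeholders, and the field names assume Finnhub's standard /quote response.

```python
import datetime as dt

import gspread
import requests

FINNHUB_TOKEN = "YOUR_FINNHUB_API_KEY"   # placeholder
SYMBOLS = ["AAPL", "MSFT", "TSLA"]       # placeholder watchlist

# Google Sheets client authenticated with a service-account key file (placeholder path)
gc = gspread.service_account(filename="service_account.json")
worksheet = gc.open("Stock Prices").sheet1  # placeholder spreadsheet name

for symbol in SYMBOLS:
    resp = requests.get(
        "https://finnhub.io/api/v1/quote",
        params={"symbol": symbol, "token": FINNHUB_TOKEN},
        timeout=10,
    )
    resp.raise_for_status()
    quote = resp.json()

    # Basic validation: skip symbols that return an empty or zero quote
    price = quote.get("c")
    if not price:
        continue

    worksheet.append_row(
        [
            dt.datetime.utcnow().isoformat(timespec="seconds"),
            symbol,
            price,            # current price
            quote.get("h"),   # day high
            quote.get("l"),   # day low
            quote.get("pc"),  # previous close
        ]
    )
```

In n8n, the equivalent would typically be a scheduled trigger feeding an HTTP Request node into a Google Sheets node, with the same validation applied in between.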
18 days ago · 14 proposals · Remote
Recommendation System
We are seeking a skilled developer to create a sophisticated film recommendation system utilizing deep learning techniques. The project involves designing an algorithm that analyzes user preferences and movie attributes to deliver personalized recommendations. The ideal candidate should have expertise in machine learning, particularly in deep learning frameworks, and experience with collaborative filtering and content-based filtering methods. A deep understanding of data handling, model optimization, and evaluation metrics is essential. This project aims to enhance user engagement and satisfaction through tailored movie suggestions.
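For orientation, here is a minimal collaborative-filtering baseline using truncated SVD on a user-item rating matrix. It is a classical matrix-factorisation sketch rather than the deep learning model this brief ultimately asks for, and the ratings file and column names are placeholders.

```python
import numpy as np
import pandas as pd
from scipy.sparse import csr_matrix
from sklearn.decomposition import TruncatedSVD

# Hypothetical ratings file with columns: user_id, movie_id, rating
ratings = pd.read_csv("ratings.csv")

user_idx = ratings["user_id"].astype("category").cat.codes
item_idx = ratings["movie_id"].astype("category").cat.codes
item_labels = ratings["movie_id"].astype("category").cat.categories

# Sparse user-item rating matrix
matrix = csr_matrix(
    (ratings["rating"], (user_idx, item_idx)),
    shape=(user_idx.max() + 1, item_idx.max() + 1),
)

# Learn low-dimensional user and item factors
svd = TruncatedSVD(n_components=50, random_state=42)
user_factors = svd.fit_transform(matrix)   # shape: (n_users, 50)
item_factors = svd.components_.T           # shape: (n_items, 50)

# Score all items for one user and recommend the top 10 unseen titles
user = 0
scores = user_factors[user] @ item_factors.T
seen = set(item_idx[user_idx == user])
top = [i for i in np.argsort(-scores) if i not in seen][:10]
print(item_labels[top].tolist())
```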
20 days ago · 16 proposals · Remote
Network & Cybersecurity Administrator
We are seeking a skilled Network and Cybersecurity Administrator to conduct comprehensive network security audits and implement robust hardening measures. The ideal candidate will possess extensive experience in identifying vulnerabilities, assessing risks, and fortifying network infrastructures against potential threats. Responsibilities include developing security protocols, configuring firewalls, and ensuring compliance with industry standards. The freelancer should demonstrate expertise in cybersecurity best practices and possess a proactive approach to safeguarding sensitive data. This project aims to enhance our network resilience and ensure optimal security posture.
22 days ago · 15 proposals · Remote
Survey Data Analysis, Reporting & Dashboard Development
I am looking for a skilled Data Analyst / Monitoring & Evaluation (M&E) specialist to support the analysis and reporting of survey data collected for an NGO-style project. The scope of work includes cleaning and validating datasets collected through tools such as KoboToolbox, performing statistical analysis, and producing clear, well-structured reports to support evidence-based decision-making.

Key responsibilities include:
- Cleaning and validating raw survey data (Excel / CSV format)
- Conducting descriptive and basic inferential statistical analysis
- Producing summary tables, charts, and visualizations
- Developing an Excel or Power BI dashboard to track key indicators
- Writing a concise analytical report (Word/PDF) with findings and recommendations

Deliverables:
- Cleaned and well-documented dataset
- Analytical report (5–10 pages)
- Dashboard (Excel or Power BI)
- Summary of key insights and recommendations

Tools preferred: Excel, R or Python, Power BI, KoboToolbox

The project is suitable for a freelancer with experience in data analysis, survey research, and M&E reporting. Clear communication, attention to detail, and adherence to deadlines are essential.
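A minimal pandas sketch of the cleaning and descriptive-analysis step is shown below; the export file, column names, and recoding rules are hypothetical, standing in for whatever the KoboToolbox CSV/XLSX export actually contains.

```python
import pandas as pd

# Hypothetical KoboToolbox export
df = pd.read_csv("survey_export.csv")

# Basic cleaning: normalise headers, trim text, coerce types, drop exact duplicates
df.columns = df.columns.str.strip().str.lower()
df["district"] = df["district"].str.strip().str.title()
df["satisfaction"] = pd.to_numeric(df["satisfaction"], errors="coerce")  # 1-5 scale
df = df.drop_duplicates()

# Validation flags for review rather than silent deletion
df["flag_missing_age"] = df["age"].isna()
df["flag_out_of_range"] = ~df["satisfaction"].between(1, 5)

# Descriptive summaries that would feed the report and dashboard
summary = df.groupby("district")["satisfaction"].agg(["count", "mean", "std"]).round(2)
gender_by_district = pd.crosstab(df["district"], df["gender"], normalize="index").round(2)

print(summary)
print(gender_by_district)

# Export the cleaned dataset for the Excel / Power BI dashboard
df.to_csv("survey_clean.csv", index=False)
```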
25 days ago · 21 proposals · Remote · Opportunity
Salesforce to Power BI Dashboard Conversion
I am seeking a proficient freelancer to transform my Salesforce-based Excel dashboard into a dynamic Power BI dashboard. The project requires the creation of high-quality visuals and the establishment of a seamless data flow from Salesforce to Power BI, encompassing fewer than 100,000 rows across various tables. It is essential that candidates possess relevant experience in similar projects; please highlight your expertise in your proposal. The goal is to deliver an engaging and insightful dashboard that enhances data visualization and analysis.
a month ago · 33 proposals · Remote
Past Projects
I need someone to input Excel formulas to sort data
I have a spreadsheet of addresses and I need someone to input formulas to pull through County and Region onto the main sheet.
Excel Specialist – Address Validation & Geolocation (Lat/Long)
We have an Excel spreadsheet containing a list of 450 addresses. Each row includes a postal address (sometimes split across multiple fields). These are global locations. We need all addresses to be validated and standardised, and each location to be assigned geographical coordinates (latitude and longitude). The coordinates should be as accurate as possible at site or street level, not just town-level centroids. The final output should be an Excel file with clean addresses and two additional columns: Latitude and Longitude. Any addresses that cannot be confidently matched should be clearly flagged for review rather than guessed.
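Below is a minimal Python sketch of a batch geocoding pass with flagging. The file and column names are placeholders, and geopy's free Nominatim service is used only for illustration; a commercial geocoder would usually be needed for reliable site-level accuracy at this volume.

```python
import pandas as pd
from geopy.extra.rate_limiter import RateLimiter
from geopy.geocoders import Nominatim

# Hypothetical spreadsheet layout
df = pd.read_excel("addresses.xlsx")
df["full_address"] = df[["address_1", "city", "country"]].fillna("").agg(", ".join, axis=1)

geolocator = Nominatim(user_agent="address-validation-demo")
geocode = RateLimiter(geolocator.geocode, min_delay_seconds=1)  # respect usage limits

def lookup(addr: str) -> pd.Series:
    """Return coordinates for an address, or flag it for manual review."""
    loc = geocode(addr)
    if loc is None:
        return pd.Series({"Latitude": None, "Longitude": None, "Flag": "REVIEW"})
    return pd.Series({"Latitude": loc.latitude, "Longitude": loc.longitude, "Flag": ""})

df = df.join(df["full_address"].apply(lookup))
df.to_excel("addresses_geocoded.xlsx", index=False)
```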
Microsoft Power BI Fabric (Data Engineering) Consultant – Ad-Hoc
I am seeking an experienced Microsoft Power BI Fabric (Data Engineering) Consultant for ad-hoc, on-and-off consulting assignments. The work will consist of short-term engagements focused on troubleshooting, optimisation, architecture review, and implementation tasks specifically within Microsoft Fabric.

Scope of Work:
- Microsoft Fabric Data Engineering solutions
- Performance optimisation and issue resolution
- Best-practice implementation, solution structuring and troubleshooting
- Guidance on enterprise-grade Fabric architecture
- Professional build-out or consulting based on the requirements I will provide

Note: I will supply all business logic, requirements, and content. Your role is to structure, implement, and optimise the solution professionally using Microsoft Fabric.

Required Expertise:
- Advanced, hands-on experience with Microsoft Fabric
- Strong background in Fabric Data Engineering and analytics workloads
- Proven consulting experience delivering Fabric-based solutions
- Ability to work independently on short, focused assignments

⚠️ Important: Please do not apply if your experience is limited to general Power BI, Azure, or Synapse. This role requires deep, specialised Microsoft Fabric expertise at an advanced consulting level.

Please send your proposal to manoindi9 gmail com. Your proposal or consulting approach should include:
- Details of the services you offer
- What post-implementation or post-launch support you can provide
- Relevant experience with Microsoft Fabric projects
Data Scientist – Measurement & Causal Inference
We are seeking a Freelance Data Scientist (3+ years experience) with strong expertise in statistical inference and causal analysis to support enterprise measurement frameworks.

Time Commitment: ⏱ 2–2.5 hours/day | Monday–Friday (no weekends)

Key Requirements:
- Strong knowledge of hypothesis testing, OLS, GLM, and causal inference
- Hands-on experience with A/B testing and experimental design
- Proficiency in Python and SQL (statsmodels, scikit-learn, DoWhy)
- Experience with Databricks or enterprise cloud environments
- Ability to work independently and collaborate via GitHub (CI/CD)

Type: Freelance
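To illustrate the kind of analysis involved, here is a minimal statsmodels sketch that estimates an A/B treatment effect via OLS on simulated data. The data-generating numbers are invented for the example; a real engagement would pull experiment data from Databricks and might frame the estimate with DoWhy instead.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 5_000

# Simulated experiment: random assignment plus a pre-treatment covariate
df = pd.DataFrame(
    {
        "treated": rng.integers(0, 2, n),
        "baseline_spend": rng.normal(100, 20, n),
    }
)
true_effect = 3.0
df["outcome"] = (
    0.5 * df["baseline_spend"]
    + true_effect * df["treated"]
    + rng.normal(0, 10, n)
)

# OLS with covariate adjustment; under random assignment the 'treated'
# coefficient is the estimated average treatment effect
model = smf.ols("outcome ~ treated + baseline_spend", data=df).fit()
print("ATE estimate:", round(model.params["treated"], 3))
print("p-value:", round(model.pvalues["treated"], 4))
print("95% CI:", model.conf_int().loc["treated"].round(3).tolist())
```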
Polymarket Wallet Forensics
Job Description
I am looking for an experienced Blockchain Data Analyst to help identify and track clusters of wallets on Polymarket (Polygon Network). The goal is to identify multiple wallets that likely belong to the same individual based on funding patterns and behavioral fingerprints.

Key Responsibilities
- Funding Source Traceback: Analyze the initial funding source of specific Polymarket wallets. Determine if the source is a private self-custodial wallet or a Centralized Exchange (CEX) hot wallet.
- Temporal & Amount Analysis: For wallets funded by exchanges, perform temporal analysis to match withdrawal times and amounts to identify "sibling" wallets (wallets funded in the same batch).
- Cluster Mapping: Identify "Hub-and-Spoke" patterns where one primary wallet distributes funds (USDC/MATIC) to multiple proxy betting accounts.
- Exit Point Verification: Track where funds are sent after betting. Identify if multiple wallets "sweep" their winnings to the same exchange deposit address or unique memo ID.
- Reporting: Provide a spreadsheet or visualization mapping the connections between the target wallet (0xb786...) and any identified related addresses.

Required Skills
- Blockchain Explorers: Expert-level use of PolyScan and Etherscan.
- Forensic Tools: Experience with Breadcrumbs.app, Arkham Intelligence, Bubblemaps, or Dune Analytics.
- Methodology: Deep understanding of "Peeling Chains," withdrawal clusters, and exchange deposit memo tracking.
- Network Knowledge: Strong understanding of the Polygon network and Polymarket’s CTF (Conditional Token Framework) contracts.

Preferred Qualifications
- Experience in "Sybil detection" or crypto-forensics for DeFi platforms.
- Ability to write custom Python/SQL scripts (Dune) to automate the search for similar funding patterns across the Polygon network.

Deliverables
- A list of all wallets identified as "highly likely" to belong to the same owner.
- A brief summary of the evidence for each link (e.g., "Funded within 60 seconds of each other," or "Shares the same Binance deposit memo").
- (Optional) A visual map of the wallet cluster.
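As a sketch of the temporal-and-amount matching described above, the following pandas snippet flags wallet pairs funded from the same source within a short window with near-identical amounts. The transfer export, column names, and thresholds are hypothetical; in practice the input would come from a Dune query or an explorer export.

```python
import pandas as pd

# Hypothetical export of inbound funding transfers, one row per transfer
# columns: tx_hash, source, wallet, amount_usdc, timestamp
transfers = pd.read_csv("funding_transfers.csv", parse_dates=["timestamp"])

WINDOW = pd.Timedelta(minutes=5)   # max gap between "sibling" fundings
AMOUNT_TOL = 0.01                  # 1% amount tolerance

pairs = []
for source, group in transfers.sort_values("timestamp").groupby("source"):
    rows = group.to_dict("records")
    for i, a in enumerate(rows):
        for b in rows[i + 1:]:
            if b["timestamp"] - a["timestamp"] > WINDOW:
                break  # rows are time-sorted, so later rows are only further apart
            close_amount = abs(a["amount_usdc"] - b["amount_usdc"]) <= AMOUNT_TOL * max(
                a["amount_usdc"], b["amount_usdc"]
            )
            if a["wallet"] != b["wallet"] and close_amount:
                pairs.append(
                    {
                        "source": source,
                        "wallet_a": a["wallet"],
                        "wallet_b": b["wallet"],
                        "gap_seconds": (b["timestamp"] - a["timestamp"]).total_seconds(),
                    }
                )

candidates = pd.DataFrame(pairs)
print(candidates.sort_values("gap_seconds").head(20))
```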
Crypto Forensics for Polymarket Wallets
I am seeking an expert in crypto forensics to analyze and trace money movements from Polymarket wallets. The objective is to identify connections between wallets that may be funded by the same individual. For instance, I have a specific wallet address (0xb786b8......53039cb18d) that I wish to investigate further. The task involves uncovering any additional wallets that share funding sources or exhibit similar transaction patterns. Your expertise in blockchain analysis will be crucial in uncovering these relationships and providing a comprehensive report on your findings.
Business Analyst / Solution Architect
We are building a real estate marketplace platform (web, mobile apps, and admin panel) connecting Buyers, Agents, Builders, Lawyers, and Accountants. We need an experienced Business Analyst / Solution Architect to create complete technical documentation, process flowcharts, and system diagrams based on our existing requirements. This role is documentation only. No development work is required.

Scope includes:
- Functional and technical requirement documentation
- User roles and permissions
- Flowcharts for onboarding, listings, matching, hiring, payments, chat, and admin workflows
- High-level system and module diagrams

Deliverables:
- Technical documentation (Word or Google Docs)
- Flowcharts and diagrams (Miro, Draw.io, or similar)

Timeline: 2 or 3 days
Employee development, satisfaction and engagement
(Employee development, satisfaction and engagement in segregated and mixed-gender environments.) I am seeking someone experienced in qualitative analysis and the Gioia method to review and refine my existing codes and themes.