ETL Projects
Looking for freelance ETL jobs and project work? PeoplePerHour has you covered.
Power BI Developer
Key Responsibilities:
- Develop functional and operational reports and dashboards.
- Build automated reports and dashboards using Power BI and other reporting tools.
- Understand business requirements to set functional specifications for reporting applications.
- Work with tools and systems on the MS SQL Server BI stack, including SSRS, T-SQL, Power Query, MDX, Power BI, and DAX.
- Quickly shape data into meaningful reports and analytics solutions.
- Apply database fundamentals such as multidimensional and relational database design.
- Analyse data sources and ensure data quality and integrity.
- Design and develop efficient data models in Power BI, ensuring optimal performance and scalability.
- Transform raw data into meaningful insights by implementing appropriate data transformations.
- Develop visually appealing, user-friendly dashboards, reports, and interactive visualizations in Power BI.
- Implement complex calculations and custom measures to meet business requirements, leveraging DAX (Data Analysis Expressions) for advanced calculations.
- Integrate data from various sources into Power BI, ensuring seamless connectivity and data refresh.
- Use Power Query to clean, transform, and shape data for reporting purposes.
- Collaborate with cross-functional teams, including business analysts, data engineers, and IT teams, to ensure alignment with overall business goals.
- Conduct regular stakeholder meetings to gather feedback and make necessary adjustments to BI solutions.
- Provide training and documentation for end users to ensure effective use of Power BI reports and dashboards.
- Monitor and optimize the performance of Power BI reports and dashboards.
- Learn new tools to enhance data quality and visualizations.

Qualifications and Skills:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Power BI Developer or in a similar role.
- Strong knowledge of Azure Data Lake, Data Factory, and other Azure data services.
- Strong proficiency in Power BI, including data modelling, DAX, and Power Query.
- Solid understanding of relational databases and SQL.
- Experience in ETL processes and data integration.
- Excellent analytical and problem-solving skills.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.

Preferred Qualifications:
- Microsoft Power BI certification.
- Experience with other BI tools (Tableau, QlikView, etc.).
- Knowledge of data warehousing concepts.
- Familiarity with cloud platforms (Azure, AWS) for BI solutions.
13 days ago · 22 proposals · Remote
Past "ETL" Projects
Python ETL Expert Needed for Data Transformation Project
Hello! We're looking for a skilled Python ETL expert to assist us with a data transformation project. Our goal is to extract, transform, and load data from various sources into a structured format for analysis. We need someone with experience in automating ETL pipelines, handling data quality issues, and ensuring seamless integration.

Key Responsibilities:
- Retrieve data from all sorts of places: databases, APIs, and flat files.
- Work on raw data, making it clean, consistent, and accurate.
- Handle missing values, normalize data, and perform some feature engineering.
- Load the transformed data where it belongs, ensuring it works with our storage solution.
- Test and validate, ensuring the data stays true to its transformed self.

Requirements:
- Proven experience in Python development with a focus on ETL processes.
- Strong understanding of data extraction, transformation, and loading best practices.
- A track record of successfully automating ETL pipelines for efficient data processing.
- Excellent problem-solving skills and meticulous attention to detail.

Deliverables:
- Structured and organized CSV files, each containing accurately transformed, structured data.
- An automated, efficient ETL pipeline that can be easily maintained.
- Regular updates on project progress, ensuring we're always on the same page.

If you're a Python ETL expert looking for an engaging project, we'd love to hear from you! Please send a proposal detailing your relevant experience, a brief overview of your approach to ETL projects, and any examples of past work.
MERN Stack dev
Hello, I'm looking for an individual who can work with us on an ongoing basis on various POC projects we have going on. We want to move forward with at least 2 ready-to-build sites/apps and have the Figma designs ready to go. We would expect you to have extensive experience with React, Express, and various related services/libraries, as well as containerisation and deployment/orchestration techniques. This is crucial, as we build our products in a micro-service manner using ECS; for example, we would expect our POCs to be built front/back/API etc. A bonus would be an individual who has experience working with ETL and can work fairly closely with our AI/ML engineer, as well as having worked with Expo to deliver native apps. We would expect this person to join our Slack organisation and work with other devs and designers on an ongoing basis. For the right person there is plenty of new and ongoing work. We're after a brilliant English/native English speaker with high availability in terms of communication, who can show verified examples of previous work.
An ETL Project in PySpark
I was looking at your profile and your skills match the requirements for an ETL project I am working on; it will only require 2 hours of work. Unfortunately the deadline is Tuesday, so I would love your help ASAP.
CI/CD Pipeline Implementation Requirement Specification
Project: ETL, Reporting and Workflow Application – UK Based Consultancy

1. Introduction
We are seeking a qualified service provider to assist in the setup and configuration of a robust CI/CD (Continuous Integration/Continuous Deployment) pipeline for our software project. The purpose of this specification is to outline our project's requirements and expectations for the CI/CD pipeline implementation.

2. Project Overview
The PHP/Laravel based application is currently an inventory database with some specialist capabilities for IT network management; its roadmap is to become a full CMDB solution with advanced data integrity, integration, data visualisation, and workflow capabilities.

3. Scope of Work
The scope of work for this CI/CD pipeline implementation includes, but is not limited to, the following components:

3.1. Source Code Management (SCM) Integration
• Integrate the CI/CD pipeline with our existing GitHub repository for version control.
• Automate triggers for builds and deployments based on code commits and merges.

3.2. Build and Deployment Automation
• Implement automated build processes for the application.
• Set up deployment automation to various environments, including development, testing, and production.

3.3. Automated Testing
• Automate unit tests, integration tests, security scans, and other relevant tests as needed.
• Ensure compatibility with Selenium for automated functional testing.
• Configured tests will be provided; however, please let us know if you have in-house experience of configuring test and UAT testing services separately.

3.4. Documentation Management
• Automate documentation updates based on code changes and new software versions.
• Ensure seamless integration with our existing documentation repository (in-house development).

3.5. Release Management
• Implement versioning and release processes, including automatic generation of release notes.
• Configure the versioning and tagging process in coordination with documentation updates.

3.6. Integration with Third-Party Tools
• Integrate with AWS for automated resource provisioning and management.
• Ensure integration with SonarQube for code quality analysis.
• Include integration with our PHP based document management portal.
• Ensure integration with Selenium.
• Ensure integration with SourceGuardian.

3.7. Notifications and Reporting
• Set up notifications for build and deployment status.
• Generate and distribute reports on test results, code quality, and pipeline performance.

3.8. Knowledge Transfer
• Provide comprehensive documentation of the CI/CD pipeline setup and configuration.
• Conduct knowledge transfer sessions for our internal team for ongoing maintenance and support.

4. Project Timeline
ASAP.

5. Required Contract
Fixed price against agreed deliverables in a Statement of Work.

6. Provider Qualifications
Evidence of completion of successful similar projects, including access to case studies and references.

7. Evaluation Criteria
We will evaluate potential providers based on their experience, expertise, proposed approach, references, timeline, and budget.
I want to build an ETL system
Hi! I'm looking for some help with a smaller project. I want to build a web application which lets users send data to BigQuery from their different marketing platforms via APIs. Interested? Ping me and I will let you know some more details. Something similar to:
https://www.catchr.io/destinations/big-query
https://supermetrics.com/products/bigquery
Expert in Tableau
I need an EXPERT in Tableau who can take data concepts and transform them into Tableau designs based on the scope of the project. Please DO NOT apply if you have ONLY worked on basic drag & drop designs. We would like a partner whom we can call upon on a regular basis.

Your skills:
- Tech: understand the ETL process, understand SQL databases, able to extract data using custom SQL.
- Design: Tableau skill sets to create calculations.

Immediate need: we have a 1-2 day project to take data that needs transforming via custom SQL, plus design and custom calcs.
Fintech (Embedded finance) and DLT (blockchain)
Expertise required for a fintech (embedded finance) and DLT (blockchain) project. Required experience includes business 'light paper' support and process-flow design by analysts and architects with a deep understanding of 'in-database' blockchain, blockchain oracles, reverse ETL, data science, analytics, and related areas. Involvement in this project serves as a precursor to a flexible yet long-term engagement for successful applicants.
Data Engineer
As a Data Engineer, you will be responsible for designing, developing, and maintaining the systems and infrastructure needed to process and manage large volumes of data. Your role will involve working with various data sources, transforming data into usable formats, and ensuring data quality and reliability. Additionally, you will collaborate with cross-functional teams to develop scalable data solutions and optimize data workflows.

Key Responsibilities:
- Data Pipeline Development: Design and implement data pipelines to extract, transform, and load (ETL) data from various sources into databases or data warehouses. Develop efficient data integration processes that handle large volumes of structured and unstructured data.
- Database Design and Optimization: Work closely with data architects and analysts to design and optimize database structures and schemas for efficient data storage and retrieval. Ensure proper indexing, partitioning, and data organization to maximize query performance.
- Data Transformation and Cleansing: Develop scripts and workflows to clean, transform, and preprocess raw data into usable formats. Apply data quality checks and validation techniques to ensure accuracy and consistency of data.
- Data Modeling and Warehousing: Design and implement data models for data warehousing and reporting purposes. Develop and maintain data marts and dimensional models to support analytical queries and reporting needs.
- Performance Tuning and Optimization: Monitor and analyze database performance, identifying and resolving performance bottlenecks. Optimize SQL queries, indexes, and database configurations to enhance system performance and scalability.
- Data Security and Governance: Implement data security measures, access controls, and data encryption techniques to protect sensitive information. Ensure compliance with data governance policies and regulations.
- Collaboration and Integration: Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and develop solutions that meet business needs. Integrate data from various systems and platforms to create a unified view of data.
- Documentation: Maintain comprehensive documentation of database designs, data models, ETL processes, and workflows. Document data standards, data dictionaries, and data lineage for effective data management and governance.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent work experience).
- Proven experience as a Data Engineer or in a similar role.
- Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, Microsoft SQL Server).
- Familiarity with big data technologies (e.g., Hadoop, Spark, Hive) and NoSQL databases (e.g., MongoDB, Cassandra).
- Experience with ETL tools and data integration frameworks.
- Knowledge of data modeling concepts and dimensional modeling.
- Understanding of data warehousing and business intelligence concepts.
- Proficiency in programming languages like Python, Java, or Scala.
- Familiarity with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.
ETL: XML file into SQL database e.g. Azure, Snowflake
Stage 1 (one file):
- Extract data from a nested XML file.
- Transform the data as needed, including adding the file name and date of creation as extra columns.
- Load the data into Azure (for now), but we would like to hear if Snowflake etc. is better.
We will connect to the data via Tableau or Power BI for analysis.

Stage 2:
- Do the same for 1,000 files every hour.
- Delete duplicate records where all lines of information are the same.

Following agreement with the candidate, an XML file will be provided, as PPH does not allow XML file uploads.
Data Quality Test Engineer
Description
Experience Level: Expert

Part-time data definition specialist to work remotely on an initial 12-month contract, defining requirements for the data definitions to be applied to leverage the client's Digital Technology Platform Strategy.

Responsibilities include working with the functions to define the following:
- Enterprise data and data hierarchy
- Data conventions that are to be used
- Guidance for enforcement and criteria to determine compliance

You will also:
- Collect, analyse, and validate data prior to it being stored for final use.
- Construct and apply standard statistical analysis and/or financial models to verify data acceptability/accuracy.
- Ensure data integrity by implementing quality assurance practices, gathering and entering missing data, and resolving any anomalies.
- Identify areas of improvement in data collection processes or systems and make recommendations to correct any deficiencies.
- Possibly work as part of, or in conjunction with, a database/data warehouse team on Extract, Transform, and Load (ETL) processes/projects.

Please respond with a first line on why you love working as a data QA, plus your CV for our consideration. Thank you.
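Validation of the kind described (checking acceptability before storage, flagging anomalies) often reduces to small rule functions; the rules below (required fields, non-negative numeric amount) are hypothetical examples, not the client's actual conventions.

```python
def validate_record(record, required=("id", "amount")):
    """Return a list of quality issues found in one record (empty = clean)."""
    issues = []
    for field in required:
        if not record.get(field):
            issues.append(f"missing {field}")
    amount = record.get("amount")
    if amount:
        try:
            if float(amount) < 0:
                issues.append("negative amount")
        except ValueError:
            issues.append("non-numeric amount")
    return issues

def validate_batch(records):
    """Split a batch into accepted rows and rejected (row, issues) pairs,
    so anomalies can be resolved before the data is stored."""
    accepted, rejected = [], []
    for rec in records:
        issues = validate_record(rec)
        if issues:
            rejected.append((rec, issues))
        else:
            accepted.append(rec)
    return accepted, rejected
```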
Data engineering and coding support for UK-based Research Group
International research group is seeking an experienced and resourceful Data Scientist on a flexible, ongoing basis over a period of 12+ months to manage and troubleshoot our existing code base, run programmes to support research and data team needs, and occasionally engage in longer periods of data scraping and analysis. This is likely to require between 2 and 15 hours per week of work, depending on project pipeline and workload. There may be some weeks where little or no input is required, and others where (with advance notice) more collaboration and iteration will be needed.

The selected contractor would:
- Harness open source data sets, pull data from APIs, and scrape data from the web to inform our understanding of cities around the world in areas of innovation, sentiment, mobility, and others
- Run existing web programmes and code to provide the research team with data, maps, and other products as needed
- Troubleshoot the existing code base where problems arise
- Identify opportunities for automation and improvement to data gathering approaches
- Occasionally, develop new or expanded approaches

We have a strong track record of working in a flexible way with people providing research and data to support the advice that we give to cities and businesses all around the world. The people who have really excelled in similar previous roles have tended to:
- Be working on a suite of other projects and/or be flexible week-to-week on hourly capacity
- Seek a mix of straightforward troubleshooting and scraping, and more exploratory and open-ended analysis
- Have a natural curiosity and interest in cities, globalisation, and innovation
- Have exceptional attention to detail
- Possess excellent Python or R skills for web scraping, API connections, ETL and EDA, plus familiarity with SQL
- Demonstrate an efficient and productive approach to working
- Have experience with MS packages and G Suite (Google Docs, Colab, etc.)
The selected supplier will also benefit from: - The ability to input into interesting work impacting cities and businesses all over the world - A highly flexible set up - Opportunities to progress within our wider data team, if the circumstances are right To apply, please enclose a CV and one sentence explaining why you are a good fit for this project and your most relevant experience. Shortlisted candidates will be invited to undertake a short paid assignment to assess mutual fit. NB: Budget negotiable dependent on experience.
Alation Training and Set Up
I need an Alation expert who can walk me through setting up Metrics Governance in Alation (as distinct from Data Governance), which would include a system/process to:
- Define & sign off the metric definition in a business glossary
- Define & sign off the SQL definition, including differences for each department
- Store the SQL so that it is used in the ETLs
- Monitor which metrics have been defined/created

I have done some Alation training, but I have not seen anything which does exactly what I need.
DATA MODELER/ANALYST
Our client is currently seeking a Data Modeler/Analyst to be responsible for ensuring high-quality, relevant data is presented and stored to facilitate data analytics, providing accurate and clear analytics and insights that will allow the business teams to design, instantiate, and report on business initiatives.

This job will have the following responsibilities:
- Identify and adopt enterprise data architecture patterns, concepts, approaches, and solutions in support of business needs.
- Examine and identify database structural necessities by evaluating client operations, applications, and programming.
- Prepare accurate database design and architecture reports for management and executive teams.
- Collaborate with data architects on data warehouse/mart/cube design.
- Design ETL processes for development.
- Design and implement a data governance and design standard strategy.
- Provide architecture and data capture feedback to front- and back-end developers to ensure the right data is captured for analysis and insight generation.
- Develop executive and analytical presentations from models.
- Provide qualitative and quantitative assessments of models, including theoretical aspects, model design, and data quality/integrity.
Backend developer for the online database analytics service
Company: startup
Level: Middle/Middle+/Senior
Position: Backend developer

About us: an online platform for analyzing blockchain data and building comparative models (both between different blockchains and offchain-onchain). We are looking for a teammate who can work with us to implement a scalable platform.

Tasks:
1. Integration with different data sources (via API and ETL).
2. Development and implementation of methods for analysing and indexing new data sources.
3. Developing the API for the platform.
4. Participating in improving data processes for the team and clients.
5. Creating and automating systems to control errors in data retrieval, relevance, and consistency of data flow.
6. Creating product documentation.

What we expect from the candidate:
- 3-5 years of commercial back-end development experience solving a wide variety of tasks (Node.js preferred)
- Experience working with cloud SQL and building analytics on it (Metabase / Redshift / BigQuery)
- Experience working with vast volumes of unstructured raw data (and having fun doing it)
- Good understanding of modern architectures and methods for solving these kinds of problems
- Experience in full-cycle (end-to-end) development, from architecture design to production
- Fluent English communication skills

The following would be a plus:
- Experience building ETL processes (both piecemeal and whole) in cloud platforms
- Experience with DevOps processes
- Knowledge of mathematical statistics
- Knowledge of machine learning theory (classical algorithms)
- Knowledge of NLP / CV theory
- Experience participating in hackathons

Please apply with your portfolio so we can discuss the specific task.
SQL Developer needed (monthly)
Requirements:
1. Experience of 4+ years
2. Microsoft SQL Server 2008 and above server-side development
3. Knowledge of tools like Tableau and/or other data visualization tools
4. T-SQL (Transact-SQL) stored procedures, functions, and triggers
5. Good understanding of performance tuning
6. Understanding .NET code / .NET development experience is a plus

Responsibilities:
1. Excellent design skills for coming up with a schema that is usable across multiple customers
2. Design and develop T-SQL procedures, query performance tunings, and SSIS packages
3. Develop underlying data models and databases
4. Develop, manage, and maintain the data dictionary and/or metadata
5. Ensure compliance with standards and conventions in developing programs
6. Design, develop, and implement the complete life cycle of data warehouses
7. Translate business requirements into software applications and models
8. Analyse and resolve complex issues without oversight from other people
9. Perform and execute data extraction, transformation, and loading using ETL tools
10. Maintain and enhance the existing data warehouse, exports, and reports
11. Perform quality checks on reports and exports to ensure exceptional quality
12. Create and maintain documentation for all projects
SSIS, Creation of a new ETL package
We currently have an ETL package that runs a number of tasks; those tasks review the data for accuracy and append the data within each file to that file's relevant SQL table. We require the creation of a new ETL package and SQL Server Agent job (or an update of the existing Server Agent job to add a second step). The ETL steps should be identical to those taken by the current job (albeit for only 4 files).
ETL task to get data from a MySQL DB to CSV
ETL task to get data from a MySQL DB to CSV, including creating a cron task to upload the data via SFTP daily. Ideally a MySQL developer or an R developer.
Expert Database Design, ETL, Python scripting
I need someone to help me with mapping and merging data from different sources inside a database. The suitable candidate needs to have at least 2-5 years of ETL and database experience.
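The mapping-and-merging idea can be sketched with stdlib sqlite3; the `crm` and `billing` source tables, the email mapping key, and the merged schema are all hypothetical examples standing in for the real sources.

```python
import sqlite3

def merge_sources(conn):
    """Map customer records from two source tables onto a common key (email)
    and merge them into one unified table inside the database."""
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS customers_merged (
            email TEXT PRIMARY KEY, name TEXT, total_billed REAL
        );
        INSERT OR REPLACE INTO customers_merged
        SELECT c.email, c.name, COALESCE(SUM(b.amount), 0)
        FROM crm c LEFT JOIN billing b ON b.customer_email = c.email
        GROUP BY c.email, c.name;
    """)
    conn.commit()
```

The LEFT JOIN keeps customers with no billing rows, and `INSERT OR REPLACE` makes the merge safe to re-run as sources change.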
SSIS package development
Hello! I need some assistance developing an SSIS package for one of our clients. It is a simple ETL job loading from one Azure DB to another Azure DB, with a temporary table and column mapping. We would need the person to work on site in London (we are next to East Croydon station). Please contact us. Thank you. Eric