Parse Projects
Looking for freelance Parse jobs and project work? PeoplePerHour has you covered.
opportunity
Parsing email headers
We require a skilled freelancer to develop a solution that seamlessly saves emails from Outlook to SharePoint (365). The primary objective is to extract specific email header information (sender, recipient, subject, and received date) and populate it into metadata on the saved email, which will then be accessible and usable in SharePoint for further processing and organization. The solution should run automatically when an email is saved into a designated document library, ensuring a smooth and efficient process. The freelancer should possess a deep understanding of the SharePoint and Outlook APIs, as well as experience with email parsing and metadata manipulation. A working prototype or demonstration of the solution is expected, along with comprehensive documentation and support to facilitate its implementation and maintenance.
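As a starting point, a minimal Python sketch of the header-extraction step, assuming the emails are available as saved .eml files; the SharePoint column names ("Sender", "Recipient", etc.) are placeholders, and the upload/metadata write to SharePoint (e.g. via Microsoft Graph) is out of scope here:

```python
# Sketch: pull the header fields the brief names out of a saved .eml file.
# The metadata field names in the returned dict are assumed, not confirmed.
from email import policy
from email.parser import BytesParser

def extract_headers(eml_path):
    with open(eml_path, "rb") as f:
        msg = BytesParser(policy=policy.default).parse(f)
    return {
        "Sender": msg["From"],
        "Recipient": msg["To"],
        "Subject": msg["Subject"],
        "ReceivedDate": msg["Date"],
    }

print(extract_headers("example.eml"))
```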
8 days ago · 21 proposals · Remote
Past "Parse" Projects
pre-funded · urgent
Need Python Code to parse text file - Part 1
I need Python code that creates a separate text file and writes certain pages (or lines) to the new file. I just want the code, to review and run myself; I don't need an executable. Part 1: extract the BEARING SEAT ELEVATIONS pages and write them to a newly created text file. I've included two sample files to read from: the BR07_02.ls2 file has 6 "pages" of BEARING SEAT ELEVATIONS, and the Bridge22.ls2 file has 1 "page". In these files they happen to be at the end, but assume they can occur anywhere within the text. The code should extract that information without the page headers and place all the other lines (non-header info) as-is into a new text file. This is just the first part of the project: there are several other reports within the text file that need to be extracted in a similar fashion, and Part 2 will be developing the code to extract all the separate reports. If Part 1 is done satisfactorily, we can discuss Part 2.
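A minimal sketch of the extraction loop. The page layout of the .ls2 reports is an assumption here (a title line containing the report name, data lines following, and a form feed starting the next page); the real header format in the sample files would need to be checked.

```python
# Sketch: copy the lines of every "BEARING SEAT ELEVATIONS" page into a new
# file, skipping the page-header lines themselves. Header detection via the
# report title and form-feed characters is an assumed convention.
REPORT = "BEARING SEAT ELEVATIONS"

def extract_report(src_path, dst_path):
    in_report = False
    with open(src_path) as src, open(dst_path, "w") as dst:
        for line in src:
            if REPORT in line:
                in_report = True        # entering a report page; skip the header line
                continue
            if in_report and line.startswith("\f"):
                in_report = False       # a form feed starts a new page
                continue
            if in_report:
                dst.write(line)         # non-header lines copied as-is

extract_report("BR07_02.ls2", "bearing_seat_elevations.txt")
```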
opportunity
Azure Function App development
Immediate need for an app architect/senior developer (one person, or more if a team/company) with strong experience in Azure Function App design, development, testing, and documentation for an enterprise client I'm working with. Previous experience in SPO (SharePoint Online) is also required. You MUST have app-architect-level design and documentation skills and experience in UML.

The client's sales division is creating a new SharePoint site where security and permissions assignment in a document library must be automated based upon values in AAD (Azure AD)/Entra. The site will have 2 "classes" of users: 'sales reps' and 'management'. These users belong to multiple sales teams. In brief, the Function App would need to:

- Create a new folder in the SPO library for each new sales rep when hired, and set permissions on that folder (and any folders created under it) such that only that sales rep, their manager, and their manager's manager up to the chief sales officer have access to its content. Obviously, the further up the org chart someone sits in the sales division, the more folders they will have access to for the people reporting to them. The Function App will therefore need to parse and loop through the hierarchy in AAD/Entra, store those relationships, and then use them to assign the appropriate permissions.
- 2nd use case: a sales rep is promoted to a management role. The Function App would need to parse through the sales rep users that previously reported to another sales manager and realign the permissions on those sales reps' folder structures to sit under the newly promoted manager.
- 3rd use case: a variation on the 2nd, where someone already working for the company in a different role transfers in at a management level. The Function App would need to realign the permissions on the affected sales reps' folder structures to sit under the transferred-in manager.
- 4th use case: when any user in scope for the Function App leaves (voluntarily or otherwise) or transfers to a different department, the client wants all of their content moved to a different SPO site collection that none of the sales-division users can access, with the moved content inheriting the permissions of the new site.
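For the hierarchy walk the first use case needs, Microsoft Graph exposes a /users/{id}/manager edge. A hedged Python sketch of that traversal (the real Function App would likely be C# or JavaScript, and token acquisition via MSAL or a managed identity is omitted):

```python
# Sketch: walk a user's management chain via Microsoft Graph, as the
# folder-permission logic would need. The /manager edge is standard Graph
# v1.0; how the token is obtained is left out of this sketch.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"

def manager_chain(user_id, token):
    chain = []
    current = user_id
    while True:
        resp = requests.get(
            f"{GRAPH}/users/{current}/manager",
            headers={"Authorization": f"Bearer {token}"},
        )
        if resp.status_code == 404:      # no manager: top of the org chart
            break
        resp.raise_for_status()
        manager = resp.json()
        chain.append(manager["userPrincipalName"])
        current = manager["id"]
    return chain
```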
opportunity
Database populated by reports scraped/parsed from our website
We require the extraction of reports from our website and their compilation into a structured database. Over the years, we have consistently published quarterly newsletters on our website, www.chirp.co.uk, which encompass a range of formatted reports that are readily accessible to the public. Our objective now is to gather all English language reports into a spreadsheet or database format, enabling our website users to conduct comprehensive searches and analyze this data. Each report follows a standardized structure, including elements such as report reference, report title, initial report, our comments, and identified contributory factors. We are seeking a skilled freelancer to undertake this task efficiently and accurately. The successful candidate will have a strong understanding of web scraping techniques and be proficient in using relevant tools and software. It is important that the freelancer possesses excellent attention to detail, can work independently, and meets deadlines. The completed database will be of immense value to our organization in facilitating data-driven decision-making and enhancing the user experience on our website. We are open to discussing the specific details and requirements further and are committed to providing fair and competitive compensation for the successful completion of this project. Sample newsletters are shown below.
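A minimal sketch of the scrape-to-spreadsheet flow. The CSS selectors ("div.report", ".reference", etc.) and the newsletter URL are placeholder assumptions; the real markup on www.chirp.co.uk would need to be inspected first.

```python
# Sketch: fetch a newsletter page and pull the standardized report fields
# into CSV rows. Every selector below is hypothetical.
import csv
import requests
from bs4 import BeautifulSoup

FIELDS = ["reference", "title", "initial_report", "comments", "factors"]

def scrape_reports(url):
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    rows = []
    for report in soup.select("div.report"):          # hypothetical selector
        rows.append({
            "reference": report.select_one(".reference").get_text(strip=True),
            "title": report.select_one(".title").get_text(strip=True),
            "initial_report": report.select_one(".initial").get_text(strip=True),
            "comments": report.select_one(".comments").get_text(strip=True),
            "factors": report.select_one(".factors").get_text(strip=True),
        })
    return rows

with open("reports.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(scrape_reports("https://www.chirp.co.uk/newsletters"))  # placeholder URL
```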
Create a basic Python script
1. Parse the JSON object R1 into a Python dictionary.
2. Print the parsed Python dictionary to the console.
3. Modify the parsed dictionary so that it no longer contains the key "motd" and its associated value.
4. Print the parsed Python dictionary to the console in a table format with rows and columns similar to this.
And talk me through the above.
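A sketch of the four steps, assuming R1 is a JSON string of a flat object; the sample payload here is invented for illustration.

```python
import json

R1 = '{"motd": "hello", "ticker": "ABC", "price": 42.5}'  # invented sample

data = json.loads(R1)        # 1. parse the JSON into a Python dictionary
print(data)                  # 2. print the parsed dictionary
data.pop("motd", None)       # 3. remove the "motd" key and its value

print(f"{'Key':<10}{'Value':<10}")   # 4. print as a simple two-column table
for key, value in data.items():
    print(f"{key:<10}{value!s:<10}")
```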
Collect a database of companies from the UK and Europe
Hello, we require the compilation of a comprehensive database of companies specializing in the production of wooden windows, greenhouses, and doors. The ideal candidate for this task should possess the necessary software tools, expertise in data parsing, and the ability to gather the essential information swiftly. The final report must include the company name, country, date of formation, contact telephone number, and email address. The deadline for this project is one week. Please offer your bids; we will consider successful candidates.
Application Development Architect for 2 page technical writeup
We're looking for a skilled Technical or Solution Architect to create a 2-page solution design document. Yes, exactly 2 pages! Your task will involve designing a solution to integrate EPC open-source data with our business website, focusing on extracting and displaying Energy Performance Certificate (EPC) ratings for UK properties. You'll draft a comprehensive technical document detailing the process flow, API integration, data fetching, and presentation. Experience with RESTful APIs, data parsing, and security measures is essential. If you're adept at creating scalable, efficient solutions and can transform complex requirements into actionable plans, we'd love to hear from you. The EPC data is available via open source: https://epc.opendatacommunities.org/docs/api/domestic . I need it done urgently, so please apply if you have strong written and oral English skills (for meetings with me). No ChatGPT replies.
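One possible fetch of domestic EPC ratings by postcode, as the design document would describe. The endpoint path, Basic-auth scheme, and response shape below are from memory of the linked docs and should be verified against them before being relied on.

```python
# Sketch: query the EPC domestic search API. Endpoint and auth details are
# assumptions to check against https://epc.opendatacommunities.org/docs/api/domestic.
import requests

def fetch_epc(postcode, email, api_key):
    resp = requests.get(
        "https://epc.opendatacommunities.org/api/v1/domestic/search",  # assumed path
        params={"postcode": postcode},
        headers={"Accept": "application/json"},
        auth=(email, api_key),          # assumed Basic auth: account email + API key
    )
    resp.raise_for_status()
    return resp.json().get("rows", [])  # assumed response shape

for row in fetch_epc("SW1A 1AA", "you@example.com", "API_KEY"):  # placeholders
    print(row.get("address"), row.get("current-energy-rating"))
```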
Scraper and inserting products in prestashop
I require an experienced developer to build a web scraper and automatically import new product listings from an external online store into my Prestashop e-commerce platform. The target site contains thousands of active products organized into various hierarchical categories.

The developer should construct a scraper, using their preferred programming language, capable of efficiently traversing the target site and parsing relevant product details such as titles, descriptions, images, pricing, and attribute values. Duplicate and incomplete product records should be omitted from the data extraction process to ensure clean, high-quality results. Once scraped, the product listing data needs to be formatted and programmatically inserted into my Prestashop store. This involves creating the corresponding categories and subcategories within Prestashop to mirror the target site's taxonomy, then populating them with new products. Images, pricing, and other configurable attribute values must be accurately transferred to each Prestashop product object.

I expect the developer to follow best practices for web crawling ethics and respect any robots.txt directives. The source site must also remain anonymous for contractual reasons. Please provide your typical hourly rate and an estimated timeline.
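A sketch of the scrape-and-dedupe half only; the Prestashop import itself would go through its Webservice API (e.g. via a library such as prestapyt), which is not covered here. The selectors and page structure are placeholder assumptions.

```python
# Sketch: scrape one product page, dropping incomplete and duplicate records,
# per the brief. Both selectors are hypothetical.
import requests
from bs4 import BeautifulSoup

seen = set()

def scrape_product(url):
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    title = soup.select_one("h1.product-title")       # hypothetical selector
    price = soup.select_one("span.price")             # hypothetical selector
    if not title or not price:
        return None                                   # omit incomplete records
    record = (title.get_text(strip=True), price.get_text(strip=True))
    if record in seen:
        return None                                   # omit duplicates
    seen.add(record)
    return {"title": record[0], "price": record[1], "url": url}
```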
Parse data between web views
I have an issue which will be a very quick fix for the right person. The app has a WKWebView in tab 1 showing a page with 3 boxes, 2 of which have been hidden on the server side. We load the website and box 1 is visible. We enter a word into the search box in box 1 and the results are generated. We tap one of the results and, outside of iOS, this result shows up in box 2. As mentioned, box 2 is hidden in tab 1 but visible in tab 2. However, the result doesn't show in box 2 in the other tab. I've tried adding cookies but this hasn't helped. More info can be provided. The budget is minimal, as this will suit someone who can fix it in very short order. The budget is not negotiable, so please do not apply if you are not satisfied with the remuneration. Thanks.
opportunity
Xero input
Hello. I am trying to tidy up some accounts and need about 30-50 purchase invoices added to Xero, plus some reconciliation. I also need to transfer sales invoices from Zoho to Xero and to parse my email inbox to check whether I'm missing any invoices. This may lead to regular bookkeeping.
My WordPress WP Admin and my website are down!!
I downloaded a print plugin and activated it, and now my website doesn't open. I am in recovery mode in WP Admin and have deleted the plugin, but the website is still down. This is the message: Parse error: syntax error in /home/customer/www/socceriraq.net/public_html/wp-content/plugins/print-google-cloud-print-gcp-woocommerce/includes/User.php on line 11 There has been a critical error on this website. Please check your site admin email inbox for instructions. Learn more about troubleshooting WordPress.
Sorting out website issues (urgent)
Hi, our website has developed parsing (or possibly cache) issues, and we need these resolved within the next 24 hours. Thanks.
RSS Feed Aggregation and Categorization Web Application
Objective: To develop a web application that captures RSS feeds from multiple sources, extracts comprehensive data including title, article body, images, source, author, etc., categorizes the content, and generates its own RSS feed based on these categories.

Functional Requirements:

RSS Feed Capture:
- The application shall integrate with various sources to capture RSS feeds.
- Upon capture, the application shall retrieve complete data from the feed, including title, article body, images, source, author, and any additional relevant information.

Data Extraction:
- The captured RSS feeds shall undergo parsing to extract relevant information from each feed entry, including but not limited to:
  - Title of the article
  - Body of the article
  - Images associated with the article
  - Source of the article (URL or name)
  - Author(s) of the article
  - Publication date/time
  - Any other metadata provided by the feed

Categorization:
- The application shall categorize the extracted content into various predefined or dynamically generated categories based on content analysis. Categories may include but are not limited to News, Technology, Sports, Finance, Entertainment, etc.
- The categorization process should utilize techniques such as keyword analysis, machine learning algorithms, or user-defined rules to assign content to appropriate categories.

RSS Feed Generation:
- Once the content is categorized, the application shall create its own RSS feed(s) for each category.
- The generated RSS feeds shall include the categorized content along with relevant metadata.
- Each RSS feed should conform to standard RSS specifications and be accessible via a unique URL.

Non-Functional Requirements:

Performance:
- The application shall handle a large volume of RSS feeds efficiently, ensuring minimal latency in capturing, processing, and categorizing the content.
- Response time for user interactions shall be optimized to provide a seamless browsing experience.

Scalability:
- The architecture of the application should be designed to scale horizontally to accommodate increasing numbers of RSS feeds and users.
- Load balancing mechanisms should be implemented to distribute incoming traffic across multiple servers.

Reliability:
- The application shall be robust and resilient to failures, ensuring continuous operation even in the event of hardware or software failures.
- Data integrity measures shall be in place to prevent data loss or corruption.

Security:
- The application shall implement authentication and authorization mechanisms to control access to sensitive functionalities and data.
- Data transmission and storage shall be encrypted to protect against unauthorized access or tampering.

Assumptions:
- The RSS feeds from various sources are accessible via standard HTTP protocols.
- The application will not alter the original content of the RSS feeds, but rather create its own feeds based on categorized content.
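A minimal Python sketch of the capture and categorization steps, using the feedparser library and the simplest of the categorization techniques the spec allows (keyword analysis). The feed URL and category keywords are illustrative.

```python
# Sketch: parse one feed and bucket its entries by keyword.
import feedparser

CATEGORIES = {
    "Technology": ["software", "ai", "chip"],
    "Sports": ["match", "league", "goal"],
    "Finance": ["market", "stock", "bank"],
}

def categorize(entry):
    text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
    for category, keywords in CATEGORIES.items():
        if any(word in text for word in keywords):
            return category
    return "News"  # default bucket

feed = feedparser.parse("https://example.com/feed.xml")  # placeholder URL
for entry in feed.entries:
    print(categorize(entry), "|", entry.get("title"), "|", entry.get("author", "n/a"))
```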
I need to develop an online 3D viewer to extract values & preview.
I'm looking for a skilled freelancer to help create a system that contributes to our online platform's capabilities in three significant ways:
- Read multiple 3D file formats
- Extract useful data from these files
- Implement an interactive 3D viewer using WebGL/PHP + Three.js technology

To provide clarity on my requirements:

Key priorities:
- Reading ability for a wide variety of formats, specifically including STEP, STP, SLDPRT, STL, DXF, IPT, X_T, X_B, 3DXML, CATPART, PRT, SAT, 3MF, and JT files. The focus will be on efficiently processing the SLDPRT, STEP, STP, and STL formats.
- Extract the following values from the uploaded file:
  1) Dimensions of the object (length, width, height), measured in millimeters (mm)
  2) Volume of the object (mm3)
  3) Material of the object
  4) Color of the object (optional)
- Development of an interactive 3D preview feature using WebGL/PHP + Three.js technology.

Skills & experience:
- Proficient in handling and parsing the mentioned 3D file formats
- Experience in WebGL and front-end development for interactive displays
- Knowledge of extracting data such as dimensions, surface area, volume, and more from 3D files
- Ability to deliver a user-friendly interface for non-technical users to engage with 3D previews

The ultimate goal of this project is to enhance user experience and allow for accurate, seamless interaction with 3D models on our platform. I'm eager to work with a developer who can bring this vision to life with efficiency and innovation.
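For the mesh formats in the list (STL, 3MF), the trimesh library gives dimensions and volume directly; a minimal sketch follows. STEP/SLDPRT and the other CAD kernels need a B-rep library (e.g. pythonocc) and are not covered here, and the file is assumed to already be in millimeters.

```python
# Sketch: bounding-box dimensions and volume of an STL mesh via trimesh.
import trimesh

mesh = trimesh.load("part.stl")            # placeholder file
length, width, height = mesh.extents       # axis-aligned bounding box, mm
print(f"Dimensions: {length:.2f} x {width:.2f} x {height:.2f} mm")
print(f"Volume: {mesh.volume:.2f} mm3")    # only meaningful for watertight meshes
```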
opportunity · pre-funded · urgent
Go lang multithreading through txt file
Need a simple Go script that will parse through a text file of emails and send mail to each of them, with the number of threads specified by flags or args. If e.g. 4 threads are specified, the script must parse 4 emails from the text file, send mail to each, then remove those emails from the text file. The script doesn't need to actually send email, as I can implement that afterwards if needed; a simple simulation of sending should suffice, but that's up to you. If 4 threads are selected, then a maximum of four sends should be running at once, only moving to the next email once a thread is available after sending.
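The deliverable itself is Go; purely as an illustration of the batch-of-N worker pattern the brief describes (parse N addresses, simulate sending, rewrite the file without them), here is a Python sketch:

```python
# Sketch: N threads drain the email list in batches; sending is simulated,
# and the file is rewritten after each batch so sent addresses are removed.
from concurrent.futures import ThreadPoolExecutor

def send(address):
    print(f"sending to {address}")           # simulation only, per the brief

def run(path, threads):
    with open(path) as f:
        emails = [line.strip() for line in f if line.strip()]
    while emails:
        batch, emails = emails[:threads], emails[threads:]
        with ThreadPoolExecutor(max_workers=threads) as pool:
            list(pool.map(send, batch))       # at most `threads` sends in flight
        with open(path, "w") as f:            # drop the sent addresses
            f.writelines(e + "\n" for e in emails)

run("emails.txt", 4)
```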
Chrome extension
I need to create a Chrome extension. What the extension should do:
1. Load the parser/crawler library from the backend (a JS file), sending the extension version number as well so that the correct JS file is returned.
2. Run the library (the app with parsers). The first version covers the 10 largest e-commerce shops, starting with Amazon.com; eventually 1,000 shops will be covered across all countries.
3. Execute the Amazon-tailored crawler whenever the user is on any Amazon website, e.g. Amazon.com. The code baseline should be modular so that we can easily add more crawlers; eventually there will be 1,000 of them.
4. The Amazon crawler parses the page's ASIN codes (product IDs).
5. It adds the ASIN codes to an array and sends the array to the backend REST API.
6. There are two cases for when to show the stamp and what details are available for the product on that page.
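The extension itself would be JavaScript; as a sketch of the backend half only, here is a hypothetical REST endpoint receiving the ASIN array. The route name and payload shape are invented for illustration.

```python
# Sketch: a minimal Flask endpoint accepting the ASIN array from the extension.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/api/asins", methods=["POST"])    # hypothetical route
def receive_asins():
    payload = request.get_json(force=True)
    asins = payload.get("asins", [])          # assumed payload key
    # ... look up stamp/details for each ASIN here ...
    return jsonify({"received": len(asins)})

if __name__ == "__main__":
    app.run(port=8000)
```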
Build a React JS app prototype that parses stock info data for the user
General function objective: help create a single-file React JS app that allows the user to search for a stock ticker. During the search, the corresponding stock ticker data is pulled from multiple APIs and parsed back, and the requested data is then displayed to the end user in a basic list format. Since the data being pulled is live-streamed, listed values such as the stock price of the searched ticker should be updated every 30 seconds (30,000 milliseconds). This is a raw prototype, so no styling is required other than the way information is listed, as seen in the attachment.

For now, I would like different data to be pulled from two different free APIs:
https://finnhub.io/api/v1/quote?symbol=${stock_ticker}&token=${FINNHUB_API_KEY}
https://www.alphavantage.co/query?function=OVERVIEW&symbol=${searchTerm}&apikey=${ALPHA_ADVANTAGE_API_KEY}
API access credentials will be provided upon selecting a candidate.

App function and layout: a search bar sits at the top of the app, with the enter button immediately to its right. The user enters the stock ticker they would like to look up. The returned information should be the ticker, stock price, price change, volume, stock description, opening price, last closing price, and last open. Each item uses a different API, as listed below:
- Ticker: as entered by the user
- Price: Finnhub.io API
- Price change: Finnhub.io API (c - pc)
- Volume: Finnhub.io API
- Description: Alphavantage.co
- Opening price: Finnhub.io
- Last close price: Finnhub.io

Please take a look at the attachment api_1.png for an API and data reference example in React JS. Because 'Price' is always changing, I would like it to update for the end user with a new stock price every 30 seconds, as collected from Finnhub.io's endpoint. Please take a look at api_2.png for how I would like the app to look as a raw, basic prototype layout; no other styling/design.
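The deliverable is React; as a language-neutral sketch of the polling half, here is a Python loop hitting the Finnhub quote endpoint given above every 30 seconds and deriving the price change as c - pc, exactly as the brief specifies. The ticker and key are placeholders.

```python
# Sketch: poll the Finnhub quote endpoint and compute the change (c - pc).
import time
import requests

def poll_quote(ticker, api_key):
    url = f"https://finnhub.io/api/v1/quote?symbol={ticker}&token={api_key}"
    while True:
        q = requests.get(url).json()
        change = q["c"] - q["pc"]          # current price minus previous close
        print(f"{ticker}: price={q['c']} change={change:+.2f} open={q['o']}")
        time.sleep(30)                     # 30,000 ms refresh, per the brief

poll_quote("AAPL", "FINNHUB_API_KEY")      # placeholder ticker and key
```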
Machine Learning Model Implementation on Firebase
Mission: Development of a machine learning model for email parsing. We are looking for an individual to develop a machine learning model capable of "parsing" emails from different email service providers. The goal is to automate the extraction of key information from these emails and present it in a concise manner. The mission includes the following tasks:
1. Collecting emails from various sources into a Firebase database.
2. Preprocessing and structuring this data.
3. Selecting relevant features.
4. Training a machine learning model.
5. Validating and optimizing the model.
6. Ensuring its integration with email service providers.
7. Deploying the model in production.
8. Implementing automated tests to detect any changes in the email source formats and update the model as needed.
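A sketch of steps 2-4 only (preprocessing, features, training), with invented toy data: a TF-IDF plus logistic-regression baseline for tagging email lines that carry key information. The Firebase collection and deployment steps are not covered here.

```python
# Sketch: a minimal text-classification baseline for email-line tagging.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

lines = [
    "Your order #1234 has shipped",       # key info
    "Thanks for being a subscriber",      # filler
    "Invoice total: $59.00 due March 1",  # key info
    "Follow us on social media",          # filler
]
labels = [1, 0, 1, 0]                     # 1 = carries key information

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(lines, labels)
print(model.predict(["Your invoice #998 is attached"]))
```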
opportunity
Script to create GFS plume line charts using GrADS
I am looking for a script to create plume charts in the style of the one in the attachment. This script should:
1. Use GEFS data via Gribfilter, or sdfopen via NOMADS, to download the data for latitude 52.5 and longitude 4.5.
2. Parse the data using only the TMP2m variable.
3. Generate the plume chart showing all ensemble members in light green and the ensemble average as a thicker red line.
4. Write the result as a PNG image.
Please note: when using Gribfilter (and g2ctl and gribmap), use a subdirectory to store all grib, ctl, and idx files. I only need the script and an example of the output PNG, nothing else.
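Outside the requested GrADS tooling, the same plume logic can be sketched in Python against a NOMADS OpenDAP dataset. The dataset URL pattern and variable/dimension names (tmp2m, ens) below are assumptions to verify against the NOMADS catalogue.

```python
# Sketch: GEFS 2m-temperature plume at 52.5N, 4.5E via OpenDAP.
import matplotlib.pyplot as plt
import xarray as xr

# Placeholder date and path pattern -- check the NOMADS GEFS catalogue.
URL = "https://nomads.ncep.noaa.gov/dods/gefs/gefs20240101/gefs_pgrb2ap5_all_00z"

ds = xr.open_dataset(URL)
point = ds["tmp2m"].sel(lat=52.5, lon=4.5, method="nearest") - 273.15  # K -> degC

for member in point["ens"].values:           # every ensemble member, light green
    plt.plot(point["time"].values, point.sel(ens=member).values, color="lightgreen")
plt.plot(point["time"].values, point.mean("ens").values, color="red", linewidth=2)

plt.title("GEFS TMP2m plume, 52.5N 4.5E")
plt.savefig("plume.png")
```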
Create demo Android app to fetch data from OBD adapter
ELM adapter data collection app. This is a home hobby project, open source; I am using a Raspberry Pi server. The project is quick and dirty, and we will work live together online as a code sprint. I will test and prove the app myself; you may do the same if you can. The API is on GitHub and is public: https://github.com/pires/obd-java-api. The endpoint is controlled by me, and the code will be pushed to GitHub as we go. Step 1 should take about 2 hours; step 2 maybe 4 hours. The UI is https://jsfiddle.net/rusty1642/hc34zx0r/5/. There will be a persistent state variable to record the app state.

Step 1: register with a server endpoint on button click. The user enters a username (u) and password (p). Registration URL = https://brainbox/cgi/register.php, which takes the 2 params u=username and p=password via either GET or POST and returns JSON:
{
  "publicDomain": "newton.house",
  "serverID": "fe138db692ff4bceb5aee6f62c9cfcd7",
  "rsyncID": "Pu0eeRhR",
  "timeRemaining": 0,
  "signedIn": 0,
  "insideLAN": 1,
  "clientIP": "192.168.0.3",
  "errors": "none"
}
On success, save the public domain name and rsyncID (auth token) to persistent storage, show a new UI panel to select the Bluetooth connection, and set a new state flag. The user will select the ELM Bluetooth connection on button click; on success, show a new UI panel with status (see the UI mock-up). Save the Bluetooth UUID or GUID to persistent storage, and set up a broadcast listener to monitor the presence of the selected Bluetooth device.

Step 2: after registration and Bluetooth selection, the app is running and listening for the ELM Bluetooth device. On finding the device, connect and ping every 5 seconds for engine on. On finding the engine on, stop the fast ping, start a timer (or use Thread.sleep()), and collect sample data using the Pires API. Save the journey start data (mileage, GPS position, app data, car data, time). The sample data will be 5 data points at each poll of the adapter; each data point is called with a class and its .run() method. On engine stop, collect the end data (mileage, GPS, time), parse it into a JSON string, and when a connection is available send it to the server endpoint as a POST request to https://newton.house/cgi/glovebox/ep.php with a JSON data structure in the body along these lines:
{
  "command": "set-car-data",
  "rsyncid": "xxxxxxx",   (saved auth token)
  "start": { "gps": "nnn.nnn nnn.nnn", "milage": nnnnnn, "cartype": x },
  "ending": { "gps": "nnn.nnn nnn.nnn", "milage": nnnnnn },
  "data": [ [int,int,int,int,int], [int,int,int,int,int], ... ]
}
It returns errors:none.

There is also a reset button: on click, clear all persistent data, reset the state variable, and show the starting UI.
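A sketch of the end-of-journey POST from step 2, with the body tidied into valid JSON. All field values are placeholders; the key spellings ("milage", "rsyncid") are kept exactly as the brief defines them, and "rsyncid" is the auth token saved at registration.

```python
# Sketch: send the journey record to the server endpoint described in step 2.
import requests

journey = {
    "command": "set-car-data",
    "rsyncid": "Pu0eeRhR",                                   # saved auth token
    "start": {"gps": "52.205 0.119", "milage": 123456, "cartype": 1},
    "ending": {"gps": "52.210 0.131", "milage": 123470},
    "data": [[900, 45, 12, 3, 88], [905, 47, 13, 3, 87]],    # 5 ints per sample
}

resp = requests.post("https://newton.house/cgi/glovebox/ep.php", json=journey)
print(resp.text)   # expected reply per the brief: errors:none
```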