
Email AI Tool and LLM Integration Project
- Budget: £1.0k (approx. $1.3k)
- Proposals: 36
- Remote
- Project ID: #4449174
- Status: Awarded
Description
Experience Level: Expert
Estimated project duration: 1 - 2 weeks
This work involves taking two existing AI tools and turning them into live products on our website. The first is the AI Email Prediction Engine, which needs to be deployed to the site, given a full user interface, and connected to a database so lookups, credits and billing can be managed. The second is the LP Mandate LLM (essentially an LLM backed by a database of information that clients will query). Both tools already exist in working form; the remaining job is to integrate them properly, deploy them and maintain their underlying databases.
PART 1 – AI Email Prediction Engine
1. What the Product Is
This is a prediction engine that identifies the most likely email address for any investor based on their name, company and domain, with an optional verification-style signal similar to Hunter.io. The final product will be a full SaaS tool on the AIP website where users can register, use a small free trial, buy credits or subscriptions, run single or bulk lookups, and download results. It will behave like a commercial email-finding platform designed specifically around AIP’s investor data patterns.
2. What We Need to Build
To turn the model into a usable product, the following must be developed:
A full web interface integrated into the AIP site.
User authentication including signup, login, password reset and email verification.
A credit system where each lookup consumes credits, with a free starter allowance.
Stripe billing for subscriptions and credit packs.
Single lookup and bulk lookup features.
Bulk upload processing for CSV and Excel files, running through a queue system.
Postgres storage for completed jobs and history.
A downloadable results function.
A user dashboard and history page.
An admin console for adjusting user credits and reviewing activity logs.
Secure FastAPI backend handling predictions, credit logic, job scheduling and invoices (a minimal endpoint sketch follows this list).
Deployment of the C++ inference engine and backend on AWS.
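To make the scope concrete, the sketch below shows how the single-lookup, credit-deduction and history requirements might fit together in the FastAPI backend. It is only an illustration: the brief does not specify the function names exposed by the existing Python wrapper around the C++ engine, the Postgres schema, or the auth mechanism, so every identifier here (predict_email, users, lookups, DATABASE_URL) is a placeholder.

```python
# Minimal sketch of the single-lookup flow, assuming the existing Python wrapper
# around the C++ engine exposes a predict_email() function and that credits live
# in a Postgres "users" table. All names are placeholders, not the real API.
import os

import psycopg2
from fastapi import Depends, FastAPI, HTTPException
from pydantic import BaseModel

# Hypothetical import of the existing wrapper around the C++ inference engine.
# from email_engine import predict_email

app = FastAPI()


class LookupRequest(BaseModel):
    first_name: str
    last_name: str
    company: str
    domain: str


def get_db():
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    try:
        yield conn
    finally:
        conn.close()


@app.post("/api/lookup")
def single_lookup(req: LookupRequest, user_id: int, conn=Depends(get_db)):
    # In production, user_id would come from the authenticated session, not a parameter.
    with conn.cursor() as cur:
        # Deduct one credit atomically; fail if the balance is already zero.
        cur.execute(
            "UPDATE users SET credits = credits - 1 "
            "WHERE id = %s AND credits > 0 RETURNING credits",
            (user_id,),
        )
        row = cur.fetchone()
        if row is None:
            raise HTTPException(status_code=402, detail="No credits remaining")

        # Call into the wrapped C++ engine (placeholder result shown here).
        # prediction = predict_email(req.first_name, req.last_name, req.company, req.domain)
        prediction = {
            "email": f"{req.first_name.lower()}.{req.last_name.lower()}@{req.domain}",
            "confidence": 0.0,
        }

        # Record the lookup so it appears in the user's history page.
        cur.execute(
            "INSERT INTO lookups (user_id, input, result) VALUES (%s, %s, %s)",
            (user_id, req.model_dump_json(), str(prediction)),
        )
    conn.commit()
    return {"credits_remaining": row[0], "prediction": prediction}
```

Bulk lookups would reuse the same credit and history logic, with the CSV/Excel rows fed through the queue system rather than handled in the request.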
3. What Has Already Been Done
The core AI model and inference system are fully complete. The engine is written in C++ and exposes two prediction functions. This is wrapped in Python so that it can be served through FastAPI as REST endpoints. A working demo already exists showing a basic front end sending prediction requests to a remote server. All data engineering work has been completed: cleaning, preprocessing, feature engineering and pattern mining. These components are modular and documented. There are one-click scripts that turn raw data into final training sets.
The training notebooks exist and the model is already trained, with weights stored in the repo and ready for immediate inference.
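Because the demo already sends prediction requests to a remote server, an early integration check can be as small as the request below. The URL and payload fields are assumptions; the demo's actual API should be confirmed with the original engineer.

```python
# Hypothetical smoke test against the existing demo endpoint; the URL and payload
# schema are placeholders to be replaced with the real ones from the demo.
import requests

DEMO_URL = "https://demo.example.com/predict"  # placeholder endpoint

payload = {
    "first_name": "Jane",
    "last_name": "Doe",
    "company": "Example Capital",
    "domain": "examplecapital.com",
}

resp = requests.post(DEMO_URL, json=payload, timeout=10)
resp.raise_for_status()
print(resp.json())  # expected: predicted email plus a confidence/verification signal
```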
4. Developer Coordination Required
You will need to liaise with the original engineer who built the C++ inference engine and Python wrapper, as well as the engineer who built the data pipelines and training system. Each holds domain-specific knowledge that will make integration and deployment significantly smoother.
PART 2 – LP Mandate LLM
1. What the Product Is
This is a mandate-intelligence platform built around Claude, supported by a vector database containing AIP’s mandate documents. Users will query LP and GP mandates across private equity, venture capital, hedge funds and real estate through a branded chat interface on the AIP website. Subscription tiers will dictate which asset classes users can access. The model will use AIP’s embedded documents to produce grounded, investment-safe answers.
2. What We Need to Build
To productise this:
A dedicated chat-style interface on the AIP website.
Login and subscription gating.
1-, 3- and 6-month plans with tiered access to different asset classes.
A secure backend proxy for all Claude calls (see the proxy sketch after this list).
Rate limiting, usage logs and audit trails.
Chat history, pinning and export features.
The vector retrieval system that feeds mandate-specific context into the model.
The final deployment within a new Intelligence section of the AIP website.
System prompts, tone rules, disclaimers and compliance guidance implemented consistently on every call.
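The proxy sketch below shows one way the backend could combine the system prompt, retrieved mandate context, rate limiting and usage logging around a Claude call. The model id, prompt text and retrieve_context() helper are placeholders; the production version would enforce subscription tiers and persist logs and rate limits in Postgres rather than in memory.

```python
# Minimal sketch of a backend proxy for Claude calls. All identifiers and the
# rate-limit policy are illustrative assumptions, not the project's final design.
import os
import time
from collections import defaultdict

from anthropic import Anthropic
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()
client = Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])

SYSTEM_PROMPT = (
    "You answer questions about LP/GP mandates using only the supplied context. "
    "Include the agreed disclaimers and never give investment advice."
)  # stand-in for the drafted tone/compliance rules

_request_log: dict[int, list[float]] = defaultdict(list)  # naive in-memory rate limiter


class ChatRequest(BaseModel):
    user_id: int
    question: str


def retrieve_context(question: str) -> str:
    """Placeholder for the vector-database lookup sketched in Part 2, section 3."""
    return "…top-k mandate passages would be returned here…"


@app.post("/api/intelligence/chat")
def chat(req: ChatRequest):
    # Allow at most 20 requests per user per minute (illustrative limit only).
    now = time.time()
    recent = [t for t in _request_log[req.user_id] if now - t < 60]
    if len(recent) >= 20:
        raise HTTPException(status_code=429, detail="Rate limit exceeded")
    _request_log[req.user_id] = recent + [now]

    context = retrieve_context(req.question)
    message = client.messages.create(
        model="claude-sonnet-4-20250514",  # placeholder model id
        max_tokens=1024,
        system=SYSTEM_PROMPT,
        messages=[{"role": "user",
                   "content": f"Context:\n{context}\n\nQuestion: {req.question}"}],
    )
    answer = message.content[0].text
    # Usage/audit logging would be written to Postgres here.
    return {"answer": answer}
```

Keeping the API key and prompt assembly behind this proxy is what makes the Claude calls "secure": the browser never talks to Anthropic directly and never sees the system prompt.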
3. What Has Already Been Done
The conceptual structure is complete. Claude has been selected as the underlying model. The retrieval architecture has already been defined and will rely on embeddings stored in a vector database. Tone, compliance and formatting requirements are already drafted. The overall system behaviour is fully mapped out: users will query mandate preferences across strategies, and the LLM will respond using AIP’s embedded documents and structured prompts.
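The retrieval piece that feeds mandate context into the model could look roughly like the sketch below. The brief does not name the embedding model or the vector database, so this uses a plain numpy cosine-similarity search over a few invented example chunks as a stand-in; in production the same query pattern would run against whichever embedding model and vector store the architecture owner has chosen.

```python
# Stand-in for the vector retrieval layer. The embed() function and the example
# document chunks are placeholders; real embeddings would come from the chosen
# embedding model and be stored in the vector database.
import numpy as np


def embed(text: str) -> np.ndarray:
    """Placeholder embedding function; deterministic random vectors for illustration."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(768)


# Pre-embedded mandate chunks (in production these live in the vector database).
documents = [
    "LP mandate: allocates to early-stage venture funds in Europe.",
    "GP mandate: value-add real estate, USD 50-100m ticket sizes.",
    "LP mandate: long/short equity hedge funds, no emerging markets.",
]
doc_vectors = np.stack([embed(d) for d in documents])


def top_k(query: str, k: int = 2) -> list[str]:
    """Return the k mandate chunks most similar to the query by cosine similarity."""
    q = embed(query)
    sims = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q))
    return [documents[i] for i in np.argsort(sims)[::-1][:k]]


if __name__ == "__main__":
    for chunk in top_k("Which LPs invest in venture capital?"):
        print(chunk)
```

The chunks returned by top_k() are what the backend proxy would pass to Claude as grounded context, with subscription tier filtering applied before retrieval.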
4. Developer Coordination Required
You will need to work closely with the person who prepared the LLM architecture and prompt framework, as well as the person managing the mandate data and vector embedding process. Each owns core pieces that must be integrated properly for the final product to function.
Client: United Kingdom
- Projects completed: 193
- Freelancers worked with: 136
- Projects awarded: 30%
- Last project: 29 Jan 2026
Clarification Board
Given the short 1-2 week timeline, could you clarify if the AWS infrastructure (specifically the containerization for the C++ engine) and the Vector Database instance are already provisioned - or will setting up the DevOps environment and embedding pipelines be part of this sprint?