Database management system (DBMS): MySQL
Description of requirements/functionality: Goal: Creation of a real-time betting odds comparison engine/database.
The platform will receive real-time odds data from a number of different betting providers via a variety of channels (XML feeds, API updates, push/pull subscriptions, scraping, etc.). Messages are buffered in AMQP queues and inserted into source-specific data marts.
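The buffer-then-insert flow described above might be sketched roughly as follows. This is only an illustration: it uses Python's in-memory `queue.Queue` as a stand-in for an AMQP broker such as RabbitMQ, and the source names, message shape, and in-memory "marts" are all hypothetical.

```python
import json
import queue
from collections import defaultdict

# Stand-in for an AMQP queue (e.g. a RabbitMQ queue consumed via pika).
odds_queue = queue.Queue()

# One "data mart" per source; here just in-memory lists keyed by source name.
data_marts = defaultdict(list)

def publish(source, payload):
    """Producer side: a feed handler buffers a message onto the queue."""
    odds_queue.put(json.dumps({"source": source, "payload": payload}))

def drain():
    """Consumer side: route each buffered message to its source-specific mart."""
    while not odds_queue.empty():
        msg = json.loads(odds_queue.get())
        data_marts[msg["source"]].append(msg["payload"])

publish("bookmaker_a", {"event": "Man City v Chelsea", "back": 2.1})
publish("exchange_b", {"event": "Man City v Chelsea", "lay": 2.0})
drain()
```

In production the `drain` loop would be a long-running AMQP consumer writing into the source-specific MySQL marts rather than Python lists.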
We intend to create a single unified view of all data sources using a new matching system that must:
- Process roughly 100-500k records per minute.
- Be resilient to failures of individual data sources, replacing them with fallback options as required.
- Be highly configurable, while using automated matching methods where possible.
The aim of the platform is to provide near-real-time information on odds combinations that allow our users to identify the most profitable betting strategies, such as:
- Matching bookmaker and exchange odds.
- Identification of profitable accumulator combinations.
- Three-way matching of outcomes with bookmakers.
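One of the strategies listed above, identifying profitable accumulator combinations, rests on the fact that the combined decimal odds of an accumulator are the product of its legs. A minimal illustration (the leg odds are made-up numbers):

```python
from math import prod

def accumulator_odds(leg_odds):
    """Combined decimal odds of an accumulator: the product of the legs."""
    return prod(leg_odds)

# A three-leg accumulator at decimal odds 2.0, 1.5 and 3.0.
combined = accumulator_odds([2.0, 1.5, 3.0])
```

The real engine would search over many such combinations and compare each combined price against the corresponding exchange or rival-bookmaker prices.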
We will push all data updates to bespoke data marts for these applications, which support:
- Reactive data updates.
- Searches and the definition of filters on the data.
- Events and alerting.
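The filter-and-alert behaviour above could be sketched as a simple observer pattern: consumers register a named predicate over incoming updates, and matching updates raise an alert. This is a hypothetical sketch, not the platform's actual API; the filter name and update fields are invented.

```python
# Registered (name, predicate) pairs and the alerts they have raised.
filters = []
alerts = []

def register_filter(name, predicate):
    """Define a filter: a named predicate over data-mart updates."""
    filters.append((name, predicate))

def on_update(update):
    """Reactive entry point: called for every pushed data-mart update."""
    for name, predicate in filters:
        if predicate(update):
            alerts.append((name, update))

register_filter("big_back_odds", lambda u: u.get("back", 0) >= 3.0)
on_update({"event": "Man City v Chelsea", "back": 2.1})
on_update({"event": "Leeds v Derby", "back": 3.4})
```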
The aim of the software is to provide a tool that finds close matches between "Back Odds" provided by bookmakers and "Lay Odds" provided by betting exchanges.
The matching process will need to find all relevant data for specific events or betting outcomes (for example, matching up all back odds for “Manchester City” to win a specific football match against relevant lay odds for the exact same outcome).
The matching logic will be written in Python. It will need to support variations in team names (e.g. Man City, Manchester City, Manchester City FC) using a central database of acceptable team-name variations.
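The name-variation lookup might work along these lines. The alias table here is a hard-coded dictionary purely for illustration; in the real system it would be the central MySQL database of acceptable team-name variations mentioned above.

```python
import re

# Hypothetical alias table mapping normalised feed names to canonical names.
TEAM_ALIASES = {
    "man city": "Manchester City",
    "manchester city": "Manchester City",
    "manchester city fc": "Manchester City",
}

def canonical_team(name):
    """Map a raw feed name to its canonical team name, if known.

    Unknown names pass through unchanged so they can be flagged for
    manual review and added to the alias table.
    """
    key = re.sub(r"\s+", " ", name.strip().lower())
    return TEAM_ALIASES.get(key, name)
```

With this in place, back and lay records can be joined on the canonical name plus the event identifier rather than on raw feed strings.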
Data is ultimately output into a final table that compares bookmaker back odds against exchange lay odds for the same event. http://puu.sh/qxupQ/1a3eba4969.png
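The back-versus-lay comparison in that final table boils down to standard matched-betting arithmetic: choose the lay stake so that the net result is the same whichever side wins, and the pair is profitable when back odds sufficiently exceed lay odds. A sketch, assuming a 2% exchange commission (commission rates vary by exchange):

```python
def lay_stake(back_stake, back_odds, lay_odds, commission):
    """Lay stake that equalises profit whichever side wins."""
    return back_stake * back_odds / (lay_odds - commission)

def match_profit(back_stake, back_odds, lay_odds, commission):
    """Guaranteed profit (or loss) of a back/lay pair for a given back stake."""
    ls = lay_stake(back_stake, back_odds, lay_odds, commission)
    profit_if_back_wins = back_stake * (back_odds - 1) - ls * (lay_odds - 1)
    profit_if_lay_wins = ls * (1 - commission) - back_stake
    # With the stake above, both outcomes yield (near enough) the same profit;
    # taking the minimum guards against rounding.
    return min(profit_if_back_wins, profit_if_lay_wins)

# Backing at 2.10 against a lay at 2.00 with 2% commission is an arbitrage.
profit = match_profit(100.0, 2.10, 2.00, 0.02)
```

The comparison table would surface exactly these quantities per matched outcome, so users can sort by guaranteed profit.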
Required skills:
- Experience in data processing and analysis (R, Python).
- Experience with object orientation and OOD in at least one programming language.
- Understanding of database modelling and data warehousing.
- Good understanding of MySQL (including indexing, partitioning, and stored procedures).
The following additional skills will be advantageous:
- Knowledge of sports or betting applications
- Sports Data Query Language (SDQL)
- DevOps experience with system and application monitoring.
- MySQL administration and configuration management, especially InnoDB performance tuning.
- Experience with common XML/JSON APIs
- Experience using AMQP applications such as RabbitMQ, Apollo, etc.
Extra notes: We are looking for highly experienced senior developers. We have a very large budget for this project and will likely recruit multiple developers to work together along with one of our existing devs.
Clarification Board
Is this somewhat related to arbitrage betting? Also, can you confirm whether any of the targeted websites use Cloudflare and need to be scraped? (Max V., 15 Aug 2016)
Hi - yes, it's related to arbitrage betting. We get most of our data from third-party sources, so scraping should not be required for the majority. (Samarjeet S., 15 Aug 2016)
Thanks for the response.
So there will be any number of sites, with data managed in data marts via AMQP. The task is to process the data, arrange it against the same events from different sources, apply some logic over it, and then show the result in one view.
Let me know if I'm getting this right?