AI/ML

About upcoming AI/ML

From the start, we have been determined to make AI and machine learning central to the Autonio solution. Traditionally, such advanced tools have been inaccessible to the general public, but with SingularityNET’s decentralized AI infrastructure we are breaking down that barrier to entry and working to give all our users access to the power and opportunity that come with AI.

To that end, we are developing an infrastructure that will serve as a scaffolding for businesses and developers to train and deploy AI agents and services focused on enhancing automated trading performance.

Traders will have the ability to integrate different AI agents and portfolio management strategies to enhance their performance and maximize profits.

Development

The latest updates can be checked here.

In addition to weekly workshops, internal analytics, design, and development, we are actively working on incorporating AI into the Autonio Ecosystem. Below are some of the steps we have taken to advance this goal in 2021:

May/June

  • Added logs

  • Performed corpus-specific, word-level factor analysis and feature engineering, and re-trained and re-evaluated the sentiment-analysis model on existing data

  • Improved predictions based on linear regression (LR) with polynomial features (a minimal sketch follows this list)

  • Provided integration tests for the simulation and backtesting frameworks

  • Resynced all OHLCV/Kline data for the 1-hour period

  • Implemented and tested collector logging

  • Set up Twitter and Reddit applications for content aggregation for sentiment analysis
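
To illustrate the linear-regression work above, here is a minimal sketch of a next-close predictor using polynomial features. It assumes scikit-learn, a simple lag-window feature set, and synthetic prices standing in for the real StorageMySQL data; it is an example of the technique, not the production predictor.

```python
# Minimal sketch (not the production code): a linear-regression price
# predictor enhanced with polynomial features, assuming scikit-learn and a
# pandas DataFrame of hourly candles with a "close" column.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

def make_supervised(close: pd.Series, lookback: int = 24):
    """Build (X, y): X holds the previous `lookback` closes, y the next close."""
    X, y = [], []
    for i in range(lookback, len(close) - 1):
        X.append(close.iloc[i - lookback:i].values)
        y.append(close.iloc[i + 1])
    return np.array(X), np.array(y)

# Hypothetical data load; the real pipeline reads candles from the StorageMySQL API.
ohlcv = pd.DataFrame({"close": np.cumsum(np.random.randn(500)) + 100})

X, y = make_supervised(ohlcv["close"])
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X[:-50], y[:-50])           # train on all but the last 50 candles
print(model.score(X[-50:], y[-50:]))  # out-of-sample R^2 on the held-out tail
```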

April

  • Added more pairs for AI data collection

  • Tested and debugged backtesting in production (holder profits, key errors, market types)

  • Migrated Simulator/Backtest underlying code to use explicitly set tz/timezone information (with code review)

  • Migrated Simulator/Backtest and underlying code to use time_end instead of time_start (with code review)

  • Ensured the Simulator can be unit-tested with a fixed random seed

  • Provided a server with an S3 bucket in Europe for Kaiko data (Uniswap data)

  • Explored whether high-frequency LOB (limit order book) data can be collected and accessed at different levels of granularity

  • Started collecting DeFi-5, AGI, and ETH data from Binance

  • Evaluated ML performance using an LSTM on price-only data with different historical/prediction periods (see the sketch after this list)

  • Migrated the LSTM-based prediction framework to use StorageMySQL API data (close price only at first; using the full scope of OHLCV and LOB features will come next)

  • Integrated the PredictorAPI LR prototype into the backtesting and simulation framework and tested it on the latest BTC/USDT data from Binance in the database

  • Fixed LR Predictor and PredictorEvaluator notebook

  • Made sure that "no trades" intervals (trade_count == 0) are skipped during training
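
As a rough illustration of the LSTM evaluation on close prices only, the sketch below trains a tiny recurrent model on a synthetic random-walk series. It assumes PyTorch and an illustrative ClosePriceLSTM class; the framework, features, and training setup actually used in the project may differ.

```python
# Minimal sketch, assuming PyTorch: an LSTM that predicts the next close
# from a window of past closes, with a fixed random seed for reproducibility.
import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)  # fixed seed, in the spirit of the reproducible Simulator tests

class ClosePriceLSTM(nn.Module):
    """Predict the next close from a window of past closes (close price only)."""
    def __init__(self, hidden_size: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])  # regress on the last hidden state

# Synthetic random-walk closes standing in for StorageMySQL data.
closes = np.cumsum(np.random.randn(600)).astype("float32") + 100
window = 48
X = np.stack([closes[i:i + window] for i in range(len(closes) - window - 1)])
y = closes[window + 1:]
X = torch.from_numpy(X).unsqueeze(-1)  # (N, window, 1)
y = torch.from_numpy(y).unsqueeze(-1)  # (N, 1)

model = ClosePriceLSTM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(5):                 # a few full-batch epochs, just to show the loop
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: mse={loss.item():.4f}")
```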

March

  • Implemented Trade, OrderBook, and OHLCV data collection/synchronization servers for AI

  • Finalized the code and provided BTC/USDT data from Binance to the database

  • Checked the data in the database and provided a StorageMySQL API to access it

  • Evaluated ML performance using LinearRegression vs. other algorithms of choice on 6 months of BTC/USDT OHLCV data from the CryptoDataDownload site

  • Implemented a prototype of the Predictor Python interface based on LinearRegression and LSTM (a sketch of such an interface follows this list)

  • Backtested the framework on the latest BTC/USDT data from Binance in the database

  • Integrated the Predictor prototype into the backtesting framework and tested it on the latest BTC/USDT data from Binance in the database

  • Evaluated ML performance using LSTM
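
To make the Predictor interface idea concrete, here is a minimal sketch of what such a Python interface could look like, with LinearRegression as one interchangeable backend. The names Predictor, LinearRegressionPredictor, fit, and predict are illustrative assumptions, not the project's actual API.

```python
# Minimal sketch: a common Predictor interface so backtesting code can swap
# LinearRegression-, LSTM-, or other model-based predictors transparently.
from abc import ABC, abstractmethod

import numpy as np
from sklearn.linear_model import LinearRegression

class Predictor(ABC):
    """Common interface for price predictors used by a backtesting framework."""

    @abstractmethod
    def fit(self, X: np.ndarray, y: np.ndarray) -> "Predictor": ...

    @abstractmethod
    def predict(self, X: np.ndarray) -> np.ndarray: ...

class LinearRegressionPredictor(Predictor):
    """LinearRegression-backed implementation of the Predictor interface."""

    def __init__(self):
        self._model = LinearRegression()

    def fit(self, X, y):
        self._model.fit(X, y)
        return self

    def predict(self, X):
        return self._model.predict(X)

# Usage: train on 6 lagged closes (synthetic data), predict the next close.
rng = np.random.default_rng(42)
closes = np.cumsum(rng.normal(size=300)) + 100
X = np.stack([closes[i:i + 6] for i in range(len(closes) - 7)])
y = closes[7:]
predictor = LinearRegressionPredictor().fit(X[:-30], y[:-30])
print(predictor.predict(X[-30:])[:3])  # out-of-sample next-close predictions
```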
