Empowering Decentralized Intelligence for Web3 agents by modularizing inference

An agent marketplace for multi-interactive, intent-based, and inference-aggregation agents


Modularizing inference with the Koboto Network multi-node infrastructure: proof nodes, model-caching nodes, and privacy nodes.

Leveraging ZK, optimistic, and probabilistic proofs, and, for privacy, integrating MPC and technologies such as Linear Secret Sharing Schemes (LSSS).
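As a rough illustration of the linear secret-sharing idea mentioned above (not Koboto's actual scheme), the sketch below shows additive n-of-n sharing over a prime field: a value is split into random shares that individually reveal nothing, and only their sum reconstructs it. The modulus and share count are placeholders.

import secrets

P = 2**127 - 1  # example prime modulus; the real field would be protocol-defined

def share(secret: int, n: int):
    """Split `secret` into n additive shares over GF(P); all n are needed to rebuild."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

shares = share(42, n=3)           # hand one share to each privacy node
assert reconstruct(shares) == 42  # no single share reveals the secret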


Solving the trilemma for on-chain inference.

Koboto Network envisions a future where the model and agent economy revolves around delivering the desired inference with verifiability, while taking compute and community into account.

Inference Trilemma
Inference Verifiability
Model Source
Compute Bandwidth
OPEN SOURCE
Built on the foundation of an open-source economy, the intersection of cryptocurrency and AI, together with positive-sum games, benefits network participants.

COMPUTE
We incorporate compute by introducing heterogeneous edge computing alongside whatever cloud computing node runners want to use for inference tasks.
Our AI agents conduct a symphony of diverse processors (CPUs, GPUs, TPUs, and others) collaborating seamlessly through the off-chain node, a dockerized container. We address compute bandwidth by leveraging processor diversity, resource allocation, and optimal partitioning between power, speed, and memory, transforming data into actionable insights exactly where they are needed most.
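A minimal sketch of how a node runner might pick among heterogeneous processors, assuming ONNX Runtime is the inference backend; the model path and the preference order of execution providers are illustrative, not part of the Koboto node software.

import onnxruntime as ort

# Illustrative preference order: use an accelerator when the host has one,
# otherwise fall back to plain CPU execution.
PREFERRED = ["TensorrtExecutionProvider", "CUDAExecutionProvider", "CPUExecutionProvider"]

def open_session(model_path: str) -> ort.InferenceSession:
    available = ort.get_available_providers()
    providers = [p for p in PREFERRED if p in available] or ["CPUExecutionProvider"]
    return ort.InferenceSession(model_path, providers=providers)

# Example (hypothetical model file shipped inside the dockerized node container):
# session = open_session("/models/agent_policy.onnx")
# outputs = session.run(None, {"input": input_tensor})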

INFERENCE VERIFIABILITY
Our modular approach combines a multi-node architecture for inference verifiability, constructing proofs according to the user's choice of ZK, optimistic, or probabilistic proof enzymes.
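The proof constructions themselves are out of scope here, but the probabilistic route can be pictured with a simple spot-check: a verifier re-runs a random sample of claimed inferences on a trusted reference model and accepts the batch only if the results agree. All names, sample sizes, and tolerances below are hypothetical.

import random

def spot_check(claims, rerun_inference, sample_size=8, tolerance=1e-4):
    """Probabilistically verify a batch of (input, claimed_output) pairs.

    claims          -- list of (input, claimed_output) reported by a prover node
    rerun_inference -- callable that recomputes the output on a trusted model
    """
    sample = random.sample(claims, min(sample_size, len(claims)))
    for x, claimed in sample:
        reference = rerun_inference(x)
        if abs(reference - claimed) > tolerance:
            return False  # mismatch: reject the whole batch
    return True  # every sampled claim matched within tolerance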

INFERENCE ENGINE & TOOLKIT
We leverage the ONNX format to run inference with models created in different frameworks, bridging the gap between diverse ML libraries. The ONNX (Open Neural Network Exchange) runtime serves as a versatile machine-learning model accelerator, supporting a wide range of inference use cases.
We also leverage TGI (Text Generation Inference), a toolkit developed by Hugging Face for deploying and serving Large Language Models (LLMs) efficiently; TGI acts as an intermediary layer between your application and the underlying LLM.
We additionally support closed-source and custom-built inference engines and toolkits, providing versatility in model acceleration for blockchain-specific datasets.
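A minimal sketch of an agent calling a TGI endpoint, assuming a TGI server is already running and reachable at the address shown (the URL, prompt, and generation parameters are placeholders); it uses the generic huggingface_hub client rather than any Koboto-specific SDK.

from huggingface_hub import InferenceClient

# Hypothetical TGI endpoint exposed by an off-chain node container.
client = InferenceClient("http://127.0.0.1:8080")

# Ask the served LLM to turn a user request into a plain-text answer.
completion = client.text_generation(
    "Summarize the risks of providing liquidity to a new AMM pool.",
    max_new_tokens=128,
    temperature=0.7,
)
print(completion)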

MODULAR & DYNAMIC MESSAGING
We use the Noise protocol framework for dynamic connections among multiple agents in order to achieve the desired inference. Agents initially form groups using a handshake protocol: across the various handshake patterns, encryption options, and key-exchange methods, they exchange cryptographic keys, establish trust, and define their roles within the group. Once grouped, agents collaborate on a specific inference task. Thanks to the security, privacy, and flexibility of the Noise protocol, agents can break their existing connections when the current goal is achieved or their needs change, then reconfigure by forming new groups with different agents or reusing existing ones.
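A highly simplified stand-in for the handshake described above, using an X25519 key exchange and ChaCha20-Poly1305 from the cryptography package rather than a full Noise implementation; it only illustrates the exchange-keys-then-encrypt flow two agents would follow, not Koboto's actual wire protocol, and the session label and message are invented.

import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each agent generates an ephemeral key pair for this session.
agent_a, agent_b = X25519PrivateKey.generate(), X25519PrivateKey.generate()

def session_key(own_private, peer_public):
    # Handshake step: derive the same symmetric session key on both sides.
    shared = own_private.exchange(peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"koboto-agent-session").derive(shared)

key_a = session_key(agent_a, agent_b.public_key())
key_b = session_key(agent_b, agent_a.public_key())
assert key_a == key_b  # both agents now share a secret channel key

# Agent A sends an encrypted inference request; Agent B decrypts it.
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(key_a).encrypt(nonce, b"run inference: task-42", None)
plaintext = ChaCha20Poly1305(key_b).decrypt(nonce, ciphertext, None)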


Agents in the Koboto Network are built on these foundational agent types:

Multi-interactive agents
Multi-interactive agents work together, sharing tasks and responsibilities to achieve a common goal. Each agent is independent, acting on its own observations and goals. In the Koboto Network, whenever a user requests an inference, agents interact to achieve the desired goal, acting cooperatively, competitively, or neutrally, and they call each other through the Noise protocol.
Intent-based agents
We incorporate AI-powered solvers that can understand and efficiently execute complex user intents, even when dealing with nuanced requests. Instead of merely executing transactions based on explicit commands, "intents" allow users to delegate transaction construction and execution to Koboto-powered solvers. AI models equipped with NLP on KOBOTO.AI can interpret these intents with a level of nuance far beyond basic instructions.
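One way to picture an intent-based solver, as a hedged sketch: a small schema for a structured intent plus a stub that asks any text-generation backend (for example, the TGI client sketched earlier) to fill it in. The schema fields, prompt, and example request are invented for illustration.

import json
from dataclasses import dataclass

@dataclass
class SwapIntent:
    sell_token: str
    buy_token: str
    max_slippage_pct: float

PROMPT = (
    "Extract a JSON object with keys sell_token, buy_token, max_slippage_pct "
    "from this request: {request!r}. Reply with JSON only."
)

def parse_intent(request: str, generate) -> SwapIntent:
    """`generate` is any text-generation callable returning a string."""
    raw = generate(PROMPT.format(request=request))
    fields = json.loads(raw)
    return SwapIntent(**fields)

# parse_intent("Swap my ETH to USDC, keep slippage under 0.5%", generate=client.text_generation)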
Inference aggregator agents
The Inference Aggregator Agent (IAA) acts as a bridge, dynamically connecting to other AI networks to fulfil the user's desired inference. The IAA is a specialized AI entity responsible for gathering, combining, and refining inferences, acting as an intermediary between users and various AI networks. It works dynamically over a local knowledge base and over source networks that it already knows about or that are registered on the Koboto Network, in order to find the inference the user desires. The Koboto IAA combines inferences using techniques like ensemble methods, weighted averaging, and consensus algorithms; the aggregated inference is then presented to the user.
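A minimal sketch of the weighted-averaging step, assuming each source network returns a numeric prediction plus a reliability weight; the values and weights below are placeholders, and ensemble or consensus strategies would slot into the same spot.

def aggregate(predictions):
    """Combine (value, weight) pairs from several source networks.

    predictions -- list of (value, weight) tuples, e.g. [(0.72, 3.0), (0.65, 1.0)]
    """
    total_weight = sum(w for _, w in predictions)
    if total_weight == 0:
        raise ValueError("no usable predictions to aggregate")
    return sum(v * w for v, w in predictions) / total_weight

# Three hypothetical source networks answering the same query:
print(aggregate([(0.72, 3.0), (0.65, 1.0), (0.70, 2.0)]))  # -> weighted consensus value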

Fabricate on
Koboto Network

A universe boosting the Web3 agent economy

01

Intent-based solvers

Leverage natural language understanding to interpret user intentions, then find optimal solutions and take actions using their model.

02

Optimal asset allocator

The optimal asset allocator acts as an investment strategy manager, determining the best allocation of assets for a user's profile while aiming to maximize returns and manage risk.

03

Prediction agent

An agent designed to trade in prediction markets on your behalf.

04

Optimized liquidity management

Balances asset availability against impermanent loss, dynamically adjusting liquidity to bring decentralized finance protocols toward an equilibrium state.

05

MEV0 agent

MEV0 is an agent subnet providing multiple MEV services on koboto.ai, such as private MEV and identifying profitable opportunities for searchers and builders.

06

Portfolio tracking agent

AI models tested against market benchmarks that provide strategic insight for portfolio optimization or even copy-trading strategies.

07

Improvement proposal review agent

An agent that evaluates proposals for protocol upgrades or changes, helping protocol users cast value-based, calculated votes informed by forecast data.

08

DAO governance manager

DAO governance managers orchestrate decentralized decision-making with efficiency and transparency.

09

Decentralized credit scoring

Decentralized credit-scoring models assess on-chain behaviour to determine creditworthiness, managing a user's reputation in a composable database with dynamic evaluations.

10

BUILD YOUR OWN


Dynamic and modular AI agents coordinating to boost the Web3 economy