Computalot (Beta)
Computalot is a distributed compute platform. Submit jobs to GPU/CPU workers, get structured JSON results back.
Private beta. Running jobs requires an admin-issued API key or admin-whitelisted wallet session. Discovery endpoints are public. We’re actively building and want your feedback.
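For authenticated calls, the sketch below shows one way to attach an admin-issued API key to a request. Note the header scheme (`Authorization: Bearer ...`) and the `/api/v1/jobs` path are assumptions for illustration, not confirmed API details; check the Endpoint Reference for the actual auth mechanism.

```python
import os
import urllib.request

# Assumption: the key is supplied via an environment variable.
API_KEY = os.environ.get("COMPUTALOT_API_KEY", "your-key-here")

# Build (but do not send) an authenticated request.
# Both the bearer scheme and the endpoint path are hypothetical.
req = urllib.request.Request(
    "https://computalot.com/api/v1/jobs",  # hypothetical endpoint
    headers={"Authorization": f"Bearer {API_KEY}"},
)
```

Sending it is then a matter of `urllib.request.urlopen(req)` once the real endpoint and scheme are confirmed.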
Using an AI Agent? Start Here
Tell your agent to install the Computalot skill:
Install the Computalot skill from https://computalot.com/skill.md

The skill teaches your agent how to authenticate, submit jobs, and retrieve results. This is the fastest way to get started.
Report Bugs & Request Features
We’re improving Computalot during beta. Please report issues, request features, and share ideas — no auth required:
curl -sS -X POST https://computalot.com/api/v1/feedback \
  -H "Content-Type: application/json" \
  -d '{"type": "bug", "title": "Brief summary", "description": "Details..."}'

Types: bug, feature_request, provisioning, job_type_request.
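The same request can be built from Python. The endpoint, payload fields, and feedback types below come straight from the curl example; the helper function and its validation are just an illustrative wrapper, not part of the SDK.

```python
import json
import urllib.request

# The four feedback types accepted by the endpoint.
VALID_TYPES = {"bug", "feature_request", "provisioning", "job_type_request"}

def build_feedback_request(type_, title, description):
    """Build (but do not send) a POST request for /api/v1/feedback."""
    if type_ not in VALID_TYPES:
        raise ValueError(f"unknown feedback type: {type_}")
    body = json.dumps(
        {"type": type_, "title": title, "description": description}
    ).encode("utf-8")
    return urllib.request.Request(
        "https://computalot.com/api/v1/feedback",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_feedback_request("bug", "Brief summary", "Details...")
# urllib.request.urlopen(req)  # uncomment to actually send
```

No auth header is needed here, since feedback is a public endpoint.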
Two Ways to Use Computalot
Sealed Recipes — Platform compute primitives with typed payloads. No code upload needed. Best for evaluation, training, fuzzing, and optimization.
Your Own Code — Create a project, push your code, submit custom jobs. Best for custom scripts, models, and arbitrary workloads.
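To make the contrast concrete, here is a sketch of what the two job shapes might look like. Every field name here (`recipe`, `params`, `project`, `entrypoint`) and the recipe name are illustrative assumptions, not the actual API schema; see the Endpoint Reference for the real payloads.

```python
import json

# 1. Sealed recipe: a typed payload against a platform primitive,
#    no code upload needed. All field names are hypothetical.
sealed_job = {
    "recipe": "lightgbm_train",  # hypothetical recipe name
    "params": {"num_rounds": 100},
}

# 2. Your own code: reference a project you created and pushed earlier.
#    Again, field names are hypothetical.
custom_job = {
    "project": "my-project",
    "entrypoint": "python train.py",
}

print(json.dumps(sealed_job))
```

Either shape would then be POSTed to a job-submission endpoint with your API key attached.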
Good for
- Evaluating strategies, candidates, or configurations against compute primitives
- Parallel evaluation across prompts, models, agents, or configs
- GPU training with progress streaming and artifact storage
- Parameter sweeps, benchmarks, and simulation batches
- Smart contract fuzzing with Echidna
- Tabular ML training with LightGBM
Reference
- Agent Skill — install computalot.com/skill.md for your agent
- LLM Reference — compact API summary for agents
- LLM Reference (Full) — complete reference with tutorials
- Python SDK — computalot package install and usage
- Workflow Recipes — full pattern catalog
- Endpoint Reference — all API endpoints