CETI AI July Townhall Recap

In our latest community town hall, CETI AI shared major strides in the development of the Intelligent Compute Fabric (ICF) and unveiled what lies ahead for the remainder of 2025. If you’re a dev, builder, or curious follower of decentralized AI infrastructure, here’s what you need to know.  

Current Focus Areas

The dev team has been focused on two core components:

The Event-Based Database System

We’ve moved away from traditional SQL databases to a new append-only log system inspired by blockchain design principles. This upgrade:

  • Eliminates concurrency issues
  • Enables real-time tracking of hardware activity
  • Forms the foundation for transparent billing and task assignment

Shout-out to Marcos for his deep-dive demo on this transformation and on how we’re leveraging tools like CurrentDB to maintain state integrity and enable auditable compute activity.
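
To make the idea concrete, here’s a minimal Python sketch of an append-only, hash-chained event log that derives current state by replaying events. The event kinds and payload fields are illustrative only, not the actual ICF schema, and CurrentDB itself is not shown.

```python
import hashlib
import json
import time
from dataclasses import dataclass


@dataclass(frozen=True)
class Event:
    """A single immutable record in the append-only log."""
    kind: str          # e.g. "task_assigned", "task_completed" (illustrative names)
    payload: dict
    timestamp: float
    prev_hash: str     # hash of the previous event, blockchain-style chaining

    def digest(self) -> str:
        body = json.dumps(
            {"kind": self.kind, "payload": self.payload,
             "timestamp": self.timestamp, "prev_hash": self.prev_hash},
            sort_keys=True,
        )
        return hashlib.sha256(body.encode()).hexdigest()


class EventLog:
    """Append-only log: events are never updated or deleted, only appended."""

    def __init__(self) -> None:
        self._events: list[Event] = []

    def append(self, kind: str, payload: dict) -> Event:
        prev_hash = self._events[-1].digest() if self._events else "genesis"
        event = Event(kind, payload, time.time(), prev_hash)
        self._events.append(event)
        return event

    def replay(self) -> dict:
        """Derive current state (here: GPU seconds per node) by folding events."""
        usage: dict[str, float] = {}
        for e in self._events:
            if e.kind == "task_completed":
                node = e.payload["node"]
                usage[node] = usage.get(node, 0.0) + e.payload["gpu_seconds"]
        return usage


log = EventLog()
log.append("task_assigned", {"node": "gpu-node-1", "task": "inference-42"})
log.append("task_completed", {"node": "gpu-node-1", "task": "inference-42", "gpu_seconds": 12.5})
print(log.replay())  # {'gpu-node-1': 12.5}
```

Because writers only ever append, there are no in-place updates to coordinate, and anyone with the same log can recompute the same state.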

Federated Kubernetes Scheduling

Our autoscaler can now deploy workloads on remote clusters—even behind firewalls—via outbound connections. Curtis demonstrated how ICF can remotely control and push workloads to any federated Kubernetes cluster, paving the way for scalable, permissionless compute enlistment.
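
For the curious, here’s a rough Python sketch of the pull-based pattern that makes firewall-friendly federation possible: an agent runs inside the remote cluster, makes only outbound HTTPS calls to a control plane, and applies whatever manifests it receives locally. The endpoint URL, token, and response format are assumptions for illustration, not the real ICF API.

```python
import time

import requests
from kubernetes import client, config, utils

# Hypothetical control-plane endpoint and enlistment token, not the actual ICF API.
CONTROL_PLANE = "https://icf.example.com/api/v1/pending-workloads"
CLUSTER_TOKEN = "replace-with-enlistment-token"


def run_agent(poll_seconds: int = 30) -> None:
    """Pull-based agent: lives inside a federated cluster and only makes
    outbound HTTPS calls, so it works behind firewalls and NAT."""
    config.load_incluster_config()   # use the service account of the agent pod
    k8s = client.ApiClient()

    while True:
        resp = requests.get(
            CONTROL_PLANE,
            headers={"Authorization": f"Bearer {CLUSTER_TOKEN}"},
            timeout=10,
        )
        resp.raise_for_status()

        # Assume the control plane returns a list of Kubernetes manifests as dicts.
        for manifest in resp.json():
            utils.create_from_dict(k8s, manifest)

        time.sleep(poll_seconds)


if __name__ == "__main__":
    run_agent()
```

Since every connection is initiated from inside the remote cluster, no inbound ports need to be opened to enlist hardware.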

Why This Matters for CETI AI and the ICF

Previously, adding machines to CETI’s infrastructure required them to be inside our main Kubernetes cluster. That limited flexibility, created security risks, and capped scalability. With our latest architecture:

  • We can now integrate external clusters, including rented machines and third-party providers.
  • This unlocks elastic compute from anywhere, ideal for users with growing AI needs.
  • Developers and enthusiasts will soon be able to enlist their own GPUs with a simple Helm command, with transparent activity reporting even if the hardware is only crypto mining during early testing.

What’s Coming Next?

Here’s the development roadmap as discussed in the meeting:

August 2025:

  • Finalize the permissionless enlistment flow for external GPUs and clusters.
  • Anyone with a cluster will be able to connect to the ICF and run real-time workloads (initially without compensation until billing goes live).

September–October 2025:

  • Billing system launch, tied directly into the event stream system.
  • Enables provable, auditable payments to providers based on real workload usage (see the sketch after this list).
  • Powered by our partnership with Wire Network for payout processing.
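
As a toy illustration of how billing can fall out of the event stream, the sketch below folds completed-task events into per-provider GPU seconds and multiplies by a rate. The event shape and price are invented for the example; the actual pricing model and the Wire Network payout step are not specified here.

```python
from collections import defaultdict

# Illustrative rate; actual pricing is not specified in the town hall.
PRICE_PER_GPU_SECOND = 0.0004  # denominated in the payout currency


def compute_payouts(events: list[dict]) -> dict[str, float]:
    """Fold the append-only event stream into an auditable payout per provider.

    Because the inputs are immutable events, anyone with the same log can
    recompute and verify the bill.
    """
    gpu_seconds: dict[str, float] = defaultdict(float)
    for e in events:
        if e.get("kind") == "task_completed":
            gpu_seconds[e["provider"]] += e["gpu_seconds"]

    return {provider: secs * PRICE_PER_GPU_SECOND for provider, secs in gpu_seconds.items()}


events = [
    {"kind": "task_assigned", "provider": "provider-a", "task": "job-1"},
    {"kind": "task_completed", "provider": "provider-a", "task": "job-1", "gpu_seconds": 3600},
    {"kind": "task_completed", "provider": "provider-b", "task": "job-2", "gpu_seconds": 900},
]
print(compute_payouts(events))
# {'provider-a': 1.44, 'provider-b': 0.36}
# A payout step (e.g. the Wire Network integration) would consume this result.
```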

Q4 2025:

  • Expand DevOps-as-a-Service with growing AI application partners like Morpheus, Alpaca, and Dula.
  • Integrate more demand sources into the ICF via API gateways.
  • Enable real monetization for hardware contributors through seamless autoscaling and job routing.

Growing Ecosystem and Biz Dev Efforts

We’ve been laying the foundation to support a variety of partners:

  • AI video generation tools (e.g., Dula’s frame-consistent character rendering)
  • Model marketplaces like Morpheus and Alpaca
  • Inference job aggregators like Psyc and Sire

Our value proposition is clear: Let CETI handle the infrastructure while partners focus on innovation. We aim to become a global AI switchboard, routing compute from underutilized GPUs to high-demand inference pipelines—automated, auditable, and efficient.
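
As a simplified picture of what “switchboard” routing means, the sketch below matches an incoming job to the provider with the most free GPUs. The provider names and single capacity field are placeholders; the real ICF scheduler would also weigh latency, price, and hardware class.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Provider:
    name: str
    free_gpus: int


def route_job(job_gpus: int, providers: list[Provider]) -> Optional[Provider]:
    """Pick the provider with the most free GPUs that can still fit the job.

    A toy stand-in for autoscaling and job routing in the ICF.
    """
    eligible = [p for p in providers if p.free_gpus >= job_gpus]
    if not eligible:
        return None
    best = max(eligible, key=lambda p: p.free_gpus)
    best.free_gpus -= job_gpus   # reserve capacity for the routed job
    return best


fleet = [Provider("home-rig", 1), Provider("rented-cluster", 8), Provider("partner-dc", 4)]
print(route_job(2, fleet).name)  # rented-cluster
```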

If you’d like to see full transcripts of our Town Halls, please join our Discord and verify yourself as a holder to gain access.
