With the number of new subnets being added, it can be hard to keep information up to date across all of them, so some data here may occasionally be out of date.
With the Gradients subnet, they’ve created a platform that makes AI training accessible to everyone, leveraging the power of the Bittensor network. With just a few clicks, anyone, regardless of their AI training knowledge, can train models on Bittensor. Their platform offers a wide variety of models and datasets, allowing users to easily select and train their AI models, either through a simple user interface or programmatically via their API.
They provide intelligent AI tools that automatically optimise your training process, ensuring the best results with minimal effort. Users can monitor their training progress in great detail through platforms like Weights & Biases and, once ready, deploy their models on Hugging Face. With the ability to fine-tune models using custom data on demand, they’ve made it easier than ever to build, train, and deploy AI models, streamlining the entire process for developers and innovators.
They’ve built a decentralised platform that allows anyone in the world to train AI models on Bittensor, the largest decentralised AI network. With just a few clicks, users can begin training their models, even without any prior AI knowledge. The process is streamlined with a wide selection of models and datasets, enabling users to quickly select the best options for their tasks. Their intelligent AI automatically helps with column mapping for optimal results, or users can make their own selections.
Training can be initiated either via an easy-to-use interface or programmatically using their API. Throughout the process, users can monitor their model’s progress in detail, with advanced tools like Weights & Biases providing deeper insights. Once the model is ready, users can deploy and use it via platforms like Hugging Face. Additionally, the platform allows users to fine-tune models using custom datasets, all on-demand. This makes training and deploying AI models simpler, more efficient, and accessible to everyone.
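To give a concrete feel for the "train via API" workflow described above, here is a minimal Python sketch. The endpoint URL, payload fields, auth header, and helper functions below are all hypothetical placeholders invented for illustration, not the real Gradients API; consult the official Gradients documentation for the actual interface.

```python
# Hypothetical sketch of submitting a training job programmatically.
# The endpoint URL, payload fields, and auth header are NOT the real
# Gradients API -- they are placeholders for illustration only.
import json
from urllib import request

API_URL = "https://example.com/api/v1/training-jobs"  # placeholder URL


def build_job_payload(model, dataset, column_mapping=None):
    """Assemble a JSON-serialisable payload describing a fine-tuning job.

    column_mapping maps dataset columns to the roles the trainer expects
    (e.g. which column holds the prompt and which the completion); per
    the description above, the platform can also infer this automatically,
    which we represent here with the sentinel value "auto".
    """
    return {
        "model": model,                            # base model to fine-tune
        "dataset": dataset,                        # dataset identifier
        "column_mapping": column_mapping or "auto",
    }


def submit_job(api_key, payload):
    """POST the job to the (placeholder) API and return the parsed response."""
    req = request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with request.urlopen(req) as resp:
        return json.load(resp)


# Build a payload locally (no network call); with no explicit mapping,
# column mapping falls back to automatic inference.
payload = build_job_payload("llama-3-8b", "my-org/support-tickets")
print(payload["column_mapping"])  # prints: auto
```

The split between `build_job_payload` and `submit_job` mirrors the workflow the page describes: pick a model and dataset, optionally map columns (or let the platform do it), then kick off training and monitor progress elsewhere.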
Namoray is one of the premier developers on Bittensor. Other team members include:
Marcus Graichen – Co-Founder
Akinwunmi Aguda – Frontend Developer
Arpan Tripathi – AI Engineer
Nicholas Bateman – Lead AI Engineer
Christopher Subia-Waud – Lead Machine Learning Engineer
Charlotte Hare – Recruitment
Q4 2024
Q1 2025
Q2 2025
Q3 2025
Q4 2025
Huge thanks to Keith Singery (aka Bittensor Guru) for all of his fantastic work in the Bittensor community. Make sure to check out his other video/audio interviews by clicking HERE.
The team at Rayon Labs have done it again with Gradients led by wanderingweights who joins the pod to discuss how he and his team have democratized AI model building with their “couple of clicks” no-code training subnet. This is one of the most groundbreaking projects on Bittensor that, in only a few months on the network, can already out-train the establishment.
A big thank you to Tao Stats for producing these insightful videos in the Novelty Search series. We appreciate the opportunity to dive deep into the groundbreaking work being done by Subnets within Bittensor! Check out some of their other videos HERE.
In this session, the team from Rayon Labs discuss the latest developments and updates within their ecosystem. The conversation delves into the performance and future improvements of the Gradients platform, focusing on its edge in machine learning and AI model training, where the platform consistently outperforms major competitors like Google and AWS in terms of cost, performance, and ease of use. They also explore the Squad platform, a tool designed for building and deploying AI agents, enabling users to create custom agents with little to no coding experience. The discussion touches on innovations like trusted execution environments (TEEs) to enhance security and privacy for AI computations. The team also highlights ongoing efforts to scale up their infrastructure, including integrating fiat payments into Chutes, a serverless AI compute platform, and expanding their use of X integrations. The session provides a deep dive into how these tools and technologies are reshaping decentralized AI and the future of machine learning.
In an earlier Novelty Search session from 2024, Namoray and the Rayon Labs team provide updates on SN19 and introduce two new subnets: Gradients and Chutes.
Novelty Search is great, but for most investors trying to understand Bittensor, the technical depth is a wall, not a bridge. If we’re going to attract investment into this ecosystem then we need more people to understand it! That’s why Siam Kidd and Mark Creaser from DSV Fund have launched Revenue Search, where they ask the simple questions that investors want to know the answers to.
Recorded in July 2025, this episode of Revenue Search features Chris (aka Wandering Weights), founder of Gradients (Subnet 56), a Bittensor subnet focused on decentralized, high-performance model training. Gradients allows users to upload a dataset, select a model, and have Bittensor miners compete to produce the best-performing version, removing the need for manual hyperparameter tuning or large ML teams. The platform has already attracted 3,000 paying users, primarily hobbyists, and delivers results significantly cheaper and better than major incumbents like Google Cloud or AWS. With a new enterprise-ready version (5.0) launching, Gradients plans to address corporate data privacy concerns by running training internally or through Chutes’ Trusted Execution Environments. All fiat revenue is used to buy and lock up the Alpha token, and the team now aims to shift focus from technical development to business growth and enterprise partnerships.
📑 GRADIENTS WHITEPAPER
How does http://Gradients.io get 82.8% win rates against HuggingFace and 100% against Google Cloud—at 1/200th the cost?
Our whitepaper reveals the formula for AI training that smokes centralised platforms. Read the method:
https://github.com/rayonlabs/G.O.D/blob/main/docs/Gradient_White_Paper.pdf
@gradients_ai obliterates the field for both text and image experiments.
GCP charges $10k for one 70b model.... Gradients < $50.
Gradients still beats GCP. every. single. time.
All we can do is stand and bow in awe to the power of decentralised incentive mechanisms.
Decentralized compute is winning. We don't have one datacenter, we have dozens. We don't have one SRE team, we have nearly 100.
Latest example: DeepSeek-R1-0528. 100% uptime, day zero support, 4x more tokens on openrouter than all other providers combined (and go check the…
Rayon Labs have now transferred approximately 1.25k TAO to miners across Subnet 64, Subnet 56, and Subnet 19 that were providing services for millions of requests during the chain pause last week. (The individual miner values were calculated using the data from the subnet…
Rayon Labs will be redistributing a portion of our existing owner emissions, in one way or another, to help compensate miners on 19, 56 and 64 through the Bittensor chain pause. We will use data from the auditing logs to calculate the incentives during the downturn.
A big thank…
DeepSeek R1 0528, Live Now on Chutes
The latest version of DeepSeek R1 is available now for free on http://chutes.ai
https://chutes.ai/app/chute/14a91d88-d6d6-5046-aaf4-eb3ad96b7247?tab=stats
#subnet64 #chutes
@chutes_ai
Keep ahead of the Bittensor exponential development curve…
Subnet Alpha is an informational platform for Bittensor Subnets.
This site is not affiliated with the Opentensor Foundation or TaoStats.
The content provided on this website is for informational purposes only. We make no guarantees regarding the accuracy or currency of the information at any given time.
Subnet Alpha is created and maintained by The Realistic Trader. If you have any suggestions or encounter any issues, please contact us at [email protected].
Copyright 2024