August Network Update

Opentensor Foundation
August 7, 2023


As part of our commitment to keeping you well informed about our progress, we will begin providing more regular Bittensor network updates. We will also host a TGIFT Twitter Spaces on Thursday, August 10 to discuss the updates below in more detail and to answer any questions you have.

Recent Milestones for the Bittensor Network:

BTLM Model Collaboration with Cerebras:

At Bittensor, we aim to remain at the forefront of AI research and development. To advance this vision, we established a strategic partnership with Cerebras, a company renowned for innovative hardware and software that accelerate AI workloads. Together, we set out to train a 3B-parameter model capable of achieving two pivotal goals: 1) attaining state-of-the-art accuracy in its class, and 2) enabling seamless inference with notably long sequence lengths¹.

We are proud to announce the successful conclusion of this collaboration and the creation of our state-of-the-art (SOTA) 3B model, BTLM.

The model has been performing very well on the Bittensor network, and we will share more data in the coming days. We look forward to our continued partnership with Cerebras as we push the boundaries of cutting-edge AI.

You can read more about the BTLM model here.
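For readers who want to try the model themselves, below is a minimal inference sketch. It assumes the checkpoint is published on Hugging Face under the name cerebras/btlm-3b-8k-base (the identifier Cerebras uses for the public release) and that the transformers library is installed; the prompt and generation settings are purely illustrative.

```python
# Minimal BTLM inference sketch. Assumes the public Hugging Face
# checkpoint "cerebras/btlm-3b-8k-base"; adjust the name if the
# release you are using differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "cerebras/btlm-3b-8k-base"

# BTLM ships custom model code, so trust_remote_code is required.
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)

# The 8,192-token context window means long prompts fit in a single pass.
prompt = "Decentralized machine learning networks such as Bittensor"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```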

First-Ever Incentivized Pre-Training:

The Significance of Pre-Training: Pre-training is a crucial step in AI model development, in which broad knowledge is encoded into a model before any task-specific training. It is computationally demanding, often requiring tens of thousands of GPUs to train models like GPT-3 or Claude. As models become increasingly sophisticated, pre-training costs escalate; hence OpenAI’s efforts to raise an additional $100B in external capital². The ability to incentivize GPUs from around the world has the potential to revolutionize the way models are trained.

Our Achievement: We are proud to announce the successful execution of the first-ever incentivized pre-training run, using GPT-2 on our testnet. This accomplishment marks a significant step towards giving open-source AI access to computational resources comparable to those of industry giants. Democratizing access to such powerful resources is a crucial contribution to fostering more inclusive participation in the AI/ML community.
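To make the pre-training objective concrete, here is a minimal causal language-modeling sketch in the spirit of the GPT-2 run described above. It shows a single training step on toy data using the standard Hugging Face GPT-2 checkpoint; the Bittensor-specific incentive and validation machinery is omitted, so this is an illustration of the loss being optimized, not our testnet setup itself.

```python
# Minimal causal-LM pre-training step, illustrating the next-token
# objective a GPT-2 pre-training run optimizes. Toy data only; the
# Bittensor incentive mechanism is not modeled here.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

model = GPT2LMHeadModel.from_pretrained("gpt2")
model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# In real pre-training, this batch would stream from a large text corpus.
batch = tokenizer(
    ["Pre-training encodes broad knowledge into a model.",
     "Task-specific fine-tuning comes afterwards."],
    return_tensors="pt",
    padding=True,
)

# Labels mirror the inputs; the model shifts them internally to compute
# the next-token cross-entropy. Padding positions are masked with -100.
labels = batch["input_ids"].clone()
labels[batch["attention_mask"] == 0] = -100

outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
print(f"pre-training loss: {outputs.loss.item():.3f}")
```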

Looking Ahead:

As we continue to progress, we have garnered increasing attention from top AI/ML talent, and we look forward to making the network more accessible to everyone. We remain committed to cultivating a diverse and proficient community of contributors, recognizing that high-quality data plays a pivotal role in our mission to make AI accessible to all.

We extend our deepest gratitude to our dedicated community, whose support and enthusiasm fuel our progress. We are excited about the journey ahead and steadfast in our commitment to transparency, regular updates, and collective collaboration toward shaping the future of AI.

¹ 8,192-token context window

² The Information, May 4, 2023
