Scale or Die at Accelerate 2025: Indexing Solana programs with Carbon

By accelerate-25

Published on 2025-05-19

Carbon, a new Rust framework for indexing Solana data, simplifies developer tooling and streamlines the process of creating indexers for Solana programs.

The notes below are AI generated and may not be 100% accurate. Watch the video to be sure!

In his presentation at Accelerate 2025, Kellian Vitre from Sevenlabs introduces Carbon, a new Rust framework for indexing Solana programs. By cutting out the boilerplate involved in building indexers, Carbon promises to save developers countless hours and help accelerate the growth of the Solana ecosystem.

Summary

Carbon is a modular Rust framework designed to streamline the process of indexing Solana data. Developed by Sevenlabs, a dev shop specializing in Solana, Carbon addresses the recurring challenge of rewriting the same code to source, decode, and process data every time an indexer for a Solana program is built.

The framework's primary goal is to allow developers to focus on delivering value to their users rather than repeatedly writing boilerplate code. Carbon achieves this by providing a simple, plug-and-play solution that connects RPC, WebSocket, or gRPC data sources with IDLs (Interface Description Language) to output Postgres tables containing all relevant accounts.

Carbon's architecture is built around three main components: data sources, decoders, and processors. Each of these components is implemented as a trait, allowing for easy customization and extension. The framework comes with pre-built modules for common use cases, significantly reducing the amount of custom code developers need to write.
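
To make the trait-based design concrete, the sketch below shows the three roles as simplified Rust traits. The names and signatures here are illustrative stand-ins, not Carbon's actual API; the real crates define their own types and async machinery.

```rust
// Simplified illustration of Carbon's three roles as Rust traits.
// These names and signatures are hypothetical, not Carbon's real API.

/// A raw update pulled from the chain (account write or instruction).
pub enum Update {
    Account { pubkey: [u8; 32], data: Vec<u8> },
    Instruction { program_id: [u8; 32], data: Vec<u8> },
}

/// Data source: anything that can be consumed for account/instruction
/// updates (RPC polling, WebSocket subscriptions, Yellowstone gRPC, ...).
pub trait DataSource {
    fn next_update(&mut self) -> Option<Update>;
}

/// Decoder: turns raw bytes into a typed, program-specific value.
pub trait Decoder {
    type Decoded;
    fn decode(&self, update: &Update) -> Option<Self::Decoded>;
}

/// Processor: the custom logic layer (write to Postgres, push to Kafka, ...).
pub trait Processor {
    type Input;
    fn process(&mut self, input: Self::Input);
}
```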

One of the most exciting aspects of Carbon is its extensibility. The team has already published seven data source crates and 40 decoder crates, with more on the way. For custom programs, Carbon provides a CLI tool that can generate decoder code from any Anchor IDL, further simplifying the development process.

Key Points:

Carbon's Core Components

Carbon's architecture revolves around three primary components: data sources, decoders, and processors. Data sources are structs that can be consumed for account or instruction updates. The framework currently supports seven pre-built data sources, including various RPC and WebSocket methods, Helius and Hands WebSockets, Yellowstone gRPC, and the recently contributed Shred Stream gRPC.

Decoders are responsible for interpreting the raw data from the data sources. Carbon offers 40 pre-built decoders, covering a wide range of common use cases. For custom programs, developers can use the Carbon CLI to generate decoder code from any Anchor IDL, ensuring compatibility with virtually any Solana program.
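
For a sense of what a decoder does under the hood, here is a generic, hypothetical example of decoding an Anchor account: check the 8-byte discriminator, then Borsh-deserialize the remaining bytes. Carbon's generated decoders follow their own conventions; the struct, discriminator value, and function below are made up for illustration.

```rust
use borsh::BorshDeserialize;

/// Hypothetical account from a custom Anchor program; the real struct and
/// its discriminator would come from the program's IDL.
#[derive(BorshDeserialize, Debug)]
pub struct Counter {
    pub authority: [u8; 32],
    pub count: u64,
}

/// Anchor prefixes every account with an 8-byte discriminator derived from
/// the account name; a decoder matches on it before deserializing.
/// Placeholder value for illustration only.
const COUNTER_DISCRIMINATOR: [u8; 8] = [0; 8];

pub fn decode_counter(data: &[u8]) -> Option<Counter> {
    // Reject accounts that are too short or belong to a different type.
    if data.len() < 8 || data[..8] != COUNTER_DISCRIMINATOR {
        return None;
    }
    // Borsh-deserialize the account body that follows the discriminator.
    Counter::try_from_slice(&data[8..]).ok()
}
```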

Processors form the custom logic layer of Carbon. This is where developers implement their specific requirements, such as sending data to Kafka, storing accounts in Postgres, or triggering events based on on-chain activities. The flexibility of the processor component allows developers to tailor Carbon to their exact needs.
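
As an illustration of the processor role, the following sketch buffers decoded SPL token accounts as Postgres upsert statements. The types here are simplified stand-ins rather than Carbon's own; a real processor would execute the statements against a database (for example via tokio-postgres or sqlx) or push events to Kafka instead of collecting strings.

```rust
/// Simplified stand-in for a decoded SPL token account.
pub struct TokenAccount {
    pub pubkey: String,
    pub mint: String,
    pub owner: String,
    pub amount: u64,
}

/// Hypothetical processor: turns each decoded account into an upsert
/// statement. In Carbon this logic would live in a processor
/// implementation; here it is a plain struct for illustration.
pub struct TokenAccountProcessor {
    pub pending_sql: Vec<String>,
}

impl TokenAccountProcessor {
    /// Called once per decoded account update coming out of the decoder.
    pub fn process(&mut self, account: &TokenAccount) {
        self.pending_sql.push(format!(
            "INSERT INTO token_accounts (pubkey, mint, owner, amount) \
             VALUES ('{}', '{}', '{}', {}) \
             ON CONFLICT (pubkey) DO UPDATE SET amount = EXCLUDED.amount",
            account.pubkey, account.mint, account.owner, account.amount
        ));
    }
}
```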

Simplifying Solana Development

One of Carbon's most significant advantages is how dramatically it simplifies the development of Solana indexers. In the live demonstration, Kellian showed an example of fewer than 150 lines of code that builds a fully functional SPL token program account indexer with real-time updates.

This example used Helius LaserStream as a data source, set up a Postgres client and GraphQL server, and implemented a processor to save token and mint accounts to a database. The entire process, from setting up the data pipeline to querying the indexed data through GraphQL, was seamless and efficient.
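
The full demo is not reproduced here, but the sketch below shows the general shape of such a pipeline: raw updates are pulled from a data source, passed through a decoder, and handed to a processor. Everything in it (the closures, the fake updates) is a hypothetical stand-in for Carbon's builder API, purely to show how the pieces compose.

```rust
// Hypothetical end-to-end wiring of the three roles. Carbon's real builder
// API differs; this only illustrates how updates flow from a data source
// through a decoder into a processor.
fn run_pipeline<D, C, P>(source: D, decoder: C, mut processor: P)
where
    D: Iterator<Item = Vec<u8>>,           // data source: raw account updates
    C: Fn(&[u8]) -> Option<(String, u64)>, // decoder: bytes -> (mint, amount)
    P: FnMut((String, u64)),               // processor: custom logic (e.g. a DB write)
{
    for raw in source {
        if let Some(decoded) = decoder(&raw) {
            processor(decoded);
        }
    }
}

fn main() {
    // Stand-in data source: two fake account updates.
    let updates = vec![vec![1u8, 2, 3], vec![4u8, 5, 6]];

    run_pipeline(
        updates.into_iter(),
        |bytes: &[u8]| Some((format!("mint-{}", bytes[0]), bytes.len() as u64)),
        |(mint, amount)| println!("indexed {mint} with amount {amount}"),
    );
}
```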

The simplicity and efficiency demonstrated in this example highlight Carbon's potential to revolutionize Solana development. By reducing the amount of boilerplate code and providing ready-to-use components, Carbon enables developers to focus on building unique features and delivering value to their users, rather than getting bogged down in the intricacies of indexing implementation.

Future Developments and Roadmap

The Carbon team has an ambitious roadmap for future developments. One of the most exciting upcoming features is a Codama renderer that will support any Codama IDL (including Anchor and Shank). This renderer will not only generate the Borsh decoder (as the current CLI does) but will also optionally generate Postgres schemas and GraphQL schemas. This means developers will be able to generate their entire program's indexer from a single IDL and a choice of data source using the Carbon CLI.

Other items on the roadmap include launching a comprehensive documentation website, extensive testing of all decoders with fixtures, and developing historical data sources for backfilling current states. The team is also working on adding support for account snapshots as a data source, which will allow indexers to start with an initial account state.

These planned features demonstrate Carbon's commitment to continually improving and expanding its capabilities, ensuring that it remains at the forefront of Solana development tooling.

Facts + Figures

  • Carbon is a modular Rust framework for indexing Solana data, developed over the past 8 months.
  • The framework provides 7 pre-built data source crates for different RPC and WebSocket methods.
  • Carbon offers 40 pre-built decoder crates for common use cases.
  • The example demonstrated required less than 150 lines of code to create a fully functional SPL token program account indexer.
  • In the demonstration, the indexer processed 26,000 Yellowstone updates in its first 10 seconds of operation.
  • After 1 minute and 30 seconds, the demo indexer had indexed 750 token accounts and 212 mint accounts.
  • Carbon recently crossed 200 stars on GitHub, indicating growing interest from the developer community.
  • The Carbon CLI can generate decoder code from any Anchor IDL for custom programs.
  • The upcoming Codama renderer will support Anchor, Shank, and Codama IDLs.
  • The Carbon team is actively seeking contributors and open to pull requests on GitHub.

Top quotes

  1. "Our premise was simple, and I think everyone has experienced this as an app dev, you just want your accounts in a database."
  1. "We don't want to be spending time rewriting the same pieces of code over and over again, and at the end of the day, that's what we're tooling is for."
  1. "We want devs to focus on delivering value to their users, not on rewriting everything over and over again."
  1. "Carbon is indexing framework on Solana, so it basically helps you to create a source, decode and process the account and instruction data from your program."
  1. "You can choose your data source, use the CLI, generate your decoder, and you're already good to go with a stream of account and instruction updates."

Questions Answered

What is Carbon and why is it important for Solana developers?

Carbon is a modular Rust framework designed for indexing Solana data. It's important for Solana developers because it simplifies the process of creating indexers for Solana programs. Carbon encapsulates common indexing tasks like decoding data, sourcing it, and processing it, allowing developers to focus on delivering value to their users rather than rewriting boilerplate code repeatedly.

How does Carbon work?

Carbon works by providing three main components: data sources, decoders, and processors. Data sources are structs that can be consumed for account or instruction updates. Decoders interpret the raw data from the data sources. Processors form the custom logic layer where developers implement their specific requirements. These components are implemented as traits, allowing for easy customization and extension.

What are the key features of Carbon?

Carbon's key features include pre-built modules for common use cases, a CLI tool for generating decoder code from Anchor IDLs, support for multiple data sources including RPC and WebSocket methods, and a flexible architecture that allows for easy customization. It also offers a simple plug-and-play solution that connects data sources with IDLs to output Postgres tables containing all relevant accounts.

What improvements does Carbon bring to Solana development?

Carbon significantly reduces the amount of code developers need to write for indexing Solana programs. In the demonstration, a fully functional SPL token program account indexer was created with less than 150 lines of code. This efficiency allows developers to focus more on building unique features and less on implementing indexing logic, potentially accelerating the development of Solana applications.

What are the future plans for Carbon?

The Carbon team has several exciting developments planned. These include a Codama renderer that will generate Postgres and GraphQL schemas from IDLs, a comprehensive documentation website, extensive testing of decoders, development of historical data sources for backfilling, and support for account snapshots as a data source. These features aim to further simplify and enhance the Solana development process.

How can developers start using Carbon?

Developers can start using Carbon by accessing the framework on GitHub, where it recently crossed 200 stars. The team provides pre-built data source and decoder crates that can be easily integrated into projects. For custom programs, developers can use the Carbon CLI to generate decoder code from Anchor IDLs. The team also encourages contributions and is open to pull requests from the community.

What types of data sources does Carbon support?

Carbon supports a variety of data sources, including multiple RPC and WebSocket methods. Specifically, it supports Program Subscribe, Block Subscribe, Helius and Hands WebSockets, Yellowstone gRPC, and the recently added Shred Stream gRPC. This wide range of support allows developers to choose the most suitable data source for their specific needs.

How does Carbon handle custom Solana programs?

For custom Solana programs, Carbon provides a CLI tool that can generate decoder code from any Anchor IDL. This means that even if a program is not covered by the 40 pre-built decoders, developers can easily create a compatible decoder for their specific program, ensuring that Carbon can work with virtually any Solana program.


Related Content

Scale or Die 2025: No-strings-attached programs w/ Pinocchio

Fernando Otero introduces Pinocchio, a new dependency-free SDK for writing efficient Solana programs

Scale or Die at Accelerate 2025: Solver Infrastructure

RockawayX Labs' Krystof Kosina discusses the challenges and solutions in developing cross-chain solvers on Solana

Ship or Die at Accelerate 2025: Time Is Money (Kawz - Time.fun)

Kawz introduces Time.fun, a platform that tokenizes time and creates new capital markets on Solana

Ship or Die at Accelerate 2025: Lightning Talk: SendAI

SendAI introduces Solana App Kit, revolutionizing mobile app development on Solana

Scale or Die at Accelerate 2025: Welcome to Scale or Die: Day 2

Day 2 of Scale or Die event focuses on infrastructure and dev tooling with workshops and summits

Ship or Die at Accelerate 2025: Advancing Solana DeFi Innovation

OKX announces major developments for Solana, including XBTC integration and increased wallet usage

Ship or Die at Accelerate 2025: Lightning Talk: MetaMask

MetaMask announces native Solana support and multi-chain wallet experience

Scale or Die at Accelerate 2025: Decompiling Solana Programs

Robert Chen from Ottersec unveils groundbreaking tools for decompiling Solana programs, enhancing transparency and security in the ecosystem.

Scale or Die at Accelerate 2025: Writing Optimized Solana Programs

Dean Little from Blueshift delivers an in-depth exploration of Solana program optimization techniques at Accelerate 2025.

Scale or Die at Accelerate 2025: Fireside: zkSVMs

Industry experts discuss the potential of zkSVMs and rollups for scaling Solana and improving DeFi applications

Scale or Die at Accelerate 2025: Messari x Solana Dev

Messari's Diran Li shares insights on building data-driven applications on Solana, focusing on data curation, AI integration, and scalable solutions.

Ship or Die at Accelerate 2025: Hello and Welcome

Solana hosts its first major US conference, focusing on policy, product development, and the future of crypto in America.

Scale or Die at Accelerate 2025: Building The On-Chain NASDAQ

Bullet aims to revolutionize financial markets with a high-performance, on-chain derivatives exchange on Solana

Ship or Die at Accelerate 2025: Lightning Talk: GEODNET

Mike Horton introduces GEODNET, a decentralized physical infrastructure network for precise positioning of robots and drones

Scale or Die at Accelerate 2025: The State of Solana MEV

An in-depth look at MEV on Solana, focusing on sandwich attacks and their impact on the ecosystem