How to “sell” an OSS framework? Propan -> FastStream

Hello, Habr! Perhaps some of you even remember me: not long ago I created Propan, a Python framework for asynchronous applications.

Two months ago I shared my progress with you and said that work on PropanV2 was actively underway. And now it’s completed. But instead of PropanV2, a slightly different framework saw the light of day – FastStream, which was developed not just by me, but by the whole AirtAI team.

In this article I will tell you how the projects differ, how to migrate from one to the other, and, of course, how all of this happened in the first place. If you are interested in how I “sold” an open-source project, and how everyone benefited from it (especially the users) – read on!

Very long preface

What is Propan?

A short excursion into recent history…

Once upon a time, when I was actively working with RabbitMQ, I kept wondering why such a powerful tool had no convenient instrumentation in the Python ecosystem. For HTTP we have everything, but for asynchronous services there are only “bare” clients. This is how the idea of building a convenient framework for working with RabbitMQ was born.

The idea quickly took shape as a small private library for internal needs, which made it possible to quickly spin up similar services around RabbitMQ.

However, six months ago, while working with Kafka (out of great need), I realized that the problem is not specific to RabbitMQ but applies to any message brokers/queues/streams, etc. That is when I started working on the project in earnest: I decided to build support for ANY broker on top of it (and, in places, brokerless pub/sub) and to bring it all to open source.

So, in May of this year, the first open release of Propan came out, which lets you declare your asynchronous consumers in a fairly declarative syntax:

from propan import PropanApp, RabbitBroker

broker = RabbitBroker()
app = PropanApp(broker)

@broker.handle("test-queue")
async def handler(user: str, user_id: int):
    print(f"User {user} - {user_id} created")
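Conceptually, `@broker.handle` registers the coroutine in a queue-to-handler registry and, on delivery, parses the message body into the handler's typed arguments. Here is a minimal in-memory sketch of that idea (not Propan's actual internals; `InMemoryBroker` is a made-up name for illustration):

```python
import asyncio
import json

class InMemoryBroker:
    """Toy stand-in for a broker: maps queue names to handler coroutines."""

    def __init__(self):
        self.handlers = {}

    def handle(self, queue: str):
        # Register the decorated coroutine as the consumer for `queue`.
        def decorator(func):
            self.handlers[queue] = func
            return func
        return decorator

    async def publish(self, queue: str, body: str):
        # Deliver the message: decode JSON and spread it into typed kwargs.
        await self.handlers[queue](**json.loads(body))

broker = InMemoryBroker()
results = []

@broker.handle("test-queue")
async def handler(user: str, user_id: int):
    results.append(f"User {user} - {user_id} created")

asyncio.run(broker.publish("test-queue", '{"user": "John", "user_id": 1}'))
print(results[0])  # User John - 1 created
```

A real broker adds serialization, acknowledgements, and connection management on top, but the decorator-based registration shown here is the core of the developer experience.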

Where did AirtAI come from?

Naturally, while working on the project, I kept an eye on the competition. When I started the private project, there was none. When I started on the open-source version, competitors existed, but they were mostly the same enthusiast “crafts” in alpha.

Of the large ones, there was only FastKafka. This is a framework from AirtAI with functionality extremely close to what I wanted, but only for Kafka. Their backlog, however, included a task to add support for other brokers. And by then they already had 400+ stars on GitHub.

So why didn’t I wait for that support to land, and decided to build my own instead?

There were a number of reasons for this:

  • AirtAI is a company of data scientists, for data science. Although the functionality of their project was close to what I wanted, its interfaces left much to be desired

  • The package itself was assembled from Jupyter notebooks. This is “cool”, but it makes community contributions very difficult

  • Their code base was not ready to support other brokers

  • The commit history showed that the team was stuck on bugfixes, and new functionality was not to be expected any time soon

  • Although the project had many stars and support from various developers, actual downloads were minimal

In general, the presence of such a project (and some others) indicated that there was a need for a tool, but the toolkit itself was in its infancy. Need to do!

Why did we unite

From the very first Propan release, I was quickly able to attract at least some developer attention to it. GitHub stars poured in, downloads grew, and most importantly – feedback arrived.

And quite quickly I realized that OSS is great and fun, but supporting a project of such ambitions, increasing functionality, writing documentation, promoting, interacting with the community, etc., is too much for one person.

Therefore, as early as the beginning of August, when the project had gained a certain “significant” number of stars on GitHub and had surpassed FastKafka several times over in downloads, I decided to write to AirtAI and invite them not to compete, but to join forces.

They agreed to cooperate within 5 minutes.

How it works

After merging with AirtAI, we took up Documentation Driven Development: we argued for several weeks and drew up a specification for what the resulting project should be. Something came from Propan, something from FastKafka (both are deprecated now). In general, after the liters of blood spilled in those disputes, everyone was happy.

The new framework, FastStream, comes out under the auspices of the AirtAI team, while I remain the lead developer of the project. I had to decline their job offer, so we worked out a rather interesting format of cooperation, the details of which I would rather not go into. However, it is an absolute win-win-win situation.

WIN: I’m happy, because the project is now moving forward in ways I could not manage alone: a release at Infobip Shift, a presentation at PyCon Prague, press releases, etc., as well as extra hands working with me on what I love.

WIN: AirtAI got a strong product for their technical PR.

WIN: Users get a whole company to support the tool, which is insurance against my burnout and leaving the project.

What exactly is new?

And a lot of things! But first of all, I must say that FastStream is built on the basis of Propan, with additional features from FastKafka. That is why almost all Propan code (after renaming and adjusting imports) works correctly in FastStream (see the migration guide).

The main innovations are the addition of publishers and extended testing functionality.

Publisher is a completely new entity that lets you conveniently build data-processing pipelines: you simply declare where to receive data from and where to send the result, and your code only processes it.

from faststream import FastStream
from faststream.kafka import KafkaBroker

broker = KafkaBroker()
app = FastStream(broker)

@broker.subscriber("in-topic")
@broker.publisher("out-topic")
async def process_data(msg: str) -> str:
    return f"{msg=} processed"

Testability is a very important aspect of FastStream. In-memory testing already existed in Propan; however, in FastStream this concept was significantly expanded with tools for end-to-end testing. Now you can validate both incoming and outgoing messages, and switch your tests to a real broker with a single parameter.

import pytest
from faststream.kafka import TestKafkaBroker

# `broker` and `process_data` are the objects from the previous example

@pytest.mark.asyncio
async def test_process_data():
    # with_real=True would run the same test against a real Kafka instance
    async with TestKafkaBroker(broker, with_real=False) as br:
        await br.publish("Hello!", topic="in-topic")

        process_data.mock.assert_called_once_with("Hello!")

In addition, a bunch of smaller things have been added: middlewares, more powerful routers, subscriber overloading, improved logging customization, and a few other things that improve the overall user experience.
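To give a feel for the middleware idea, here is a conceptual pure-Python sketch (not FastStream's actual middleware API): a middleware wraps every handler so that cross-cutting logic, such as logging or metrics, runs before and after message processing without touching the handler's own code.

```python
import asyncio
import functools

log = []

def logging_middleware(handler):
    """Wrap a message handler with before/after logging (cross-cutting logic)."""
    @functools.wraps(handler)
    async def wrapper(msg):
        log.append(f"received: {msg}")
        result = await handler(msg)
        log.append(f"processed: {result}")
        return result
    return wrapper

@logging_middleware
async def process(msg: str) -> str:
    # The business logic stays oblivious to the logging around it.
    return msg.upper()

result = asyncio.run(process("hello"))
print(result)  # HELLO
print(log)     # ['received: hello', 'processed: HELLO']
```

In a framework, the broker applies such wrappers to every subscriber automatically, so one middleware declaration covers all handlers at once.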

Where to go?

The project is now actively developing: we work closely with the companies and people who have already adopted FastStream and process their feedback. At the moment, FastStream supports RabbitMQ, Kafka, and NATS. Support for Redis Pub/Sub and SQS is expected to be ported over from Propan in the near future, as well as support for Redis Streams, which we get asked about a lot.

Since the project now has two visionaries, all changes go through approvals on both sides, which allows us to significantly improve the quality of the product, polishing each idea before its implementation (Documentation Driven Development, remember?). All of these discussions are happening right on GitHub, so anyone interested in Open Source can also participate.

For example, we are currently actively arguing about how to “correctly” name entities in the automatically generated AsyncAPI specifications.

To be completely frank, FastStream is just a very advanced, convenient, and unified client for various message brokers. But it is a client on top of which you can build excellent solutions of an even higher level (for example, the next Celery killer, or a distributed data-processing platform whose services are generated from a config). AirtAI themselves have already developed a CLI utility on top of ChatGPT that generates ready-made FastStream services from your description. Let’s see what other new projects FastStream opens the way for.

In lieu of a conclusion

FastStream is an example of the unique relationships that can be built in the open-source world. I just wanted to share this case study with you, and I hope it inspires someone to pick up their pet project and develop it into something bigger (mine sat on the shelf for several years).
