FLUX
by Black Forest Labs: https://bfl.ai.
Documentation for our API can be found here: docs.bfl.ai.
This repo contains minimal inference code to run image generation & editing with our Flux open-weight models.
Local installation
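The exact commands may differ from the repository's current instructions; a minimal install sketch, assuming Python 3.10 and an editable install with extras:

```shell
# Clone the repository and install it into a fresh virtual environment.
git clone https://github.com/black-forest-labs/flux
cd flux
python3.10 -m venv .venv
source .venv/bin/activate
# The extras group name ("all") is an assumption; check pyproject.toml
# for the extras the repository actually defines.
pip install -e ".[all]"
```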
Local installation with TensorRT support
If you would like to install the repository with TensorRT support, you currently need to use a PyTorch container image from NVIDIA instead. First install enroot, then follow the steps below:
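A sketch of the enroot-based setup; the container tag and the extras group name are assumptions and should be checked against NVIDIA's PyTorch container releases and the repository's pyproject.toml:

```shell
# Import an NVIDIA PyTorch container with enroot and start it with the
# repository mounted read-write. The tag (25.01-py3) is an assumption;
# pick a current release from the NVIDIA NGC catalog.
enroot import 'docker://nvcr.io#nvidia/pytorch:25.01-py3'
enroot create -n flux-trt nvidia+pytorch+25.01-py3.sqsh
enroot start --mount "$PWD":/workspace/flux --rw flux-trt

# Inside the container, install the repo with TensorRT extras.
# The extras name ("tensorrt") is an assumption.
cd /workspace/flux
pip install -e ".[tensorrt]"
```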
Open-weight models
We are offering an extensive suite of open-weight models. For more information about the individual models, please refer to the link under Usage.
The weights of the autoencoder are also released under Apache-2.0 and can be found in the Hugging Face repositories above.
API usage
Our API offers access to all models, including our Pro-tier models whose weights are not openly released. Check out our API documentation at docs.bfl.ai to learn more.
Licensing models for commercial use
You can license our models for commercial use here: https://bfl.ai/pricing/licensing
As the fee is based on monthly usage, we provide code that automatically tracks your usage via the BFL API. To enable usage tracking, select track_usage in the CLI or check the corresponding box in our provided demos.
Example: Using FLUX.1 Kontext with usage tracking
We provide a reference implementation for running FLUX.1 with usage tracking enabled for commercial licensing. It can be customized as needed, as long as the usage reporting remains accurate.
For the reporting logic to work, you will need to set your API key as an environment variable before running:
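A sketch of the export; the variable name BFL_API_KEY is an assumption based on the BFL API naming and should be checked against the repository's reporting code:

```shell
# Export your BFL API key so the usage-reporting code can authenticate.
# BFL_API_KEY is the assumed variable name; replace the value with your real key.
export BFL_API_KEY="your-api-key-here"
```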
With tracking activated, you can call FLUX.1 Kontext [dev] like this:
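A sketch of an interactive invocation; the module path and flag names below are assumptions and should be checked against the repository's CLI entry points:

```shell
# Launch the FLUX.1 Kontext [dev] CLI in interactive mode with usage
# tracking enabled. Both the module path (flux.cli_kontext) and the
# --loop / --track_usage flags are assumptions.
python -m flux.cli_kontext --loop --track_usage
```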
For a single generation:
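A sketch of a one-shot tracked generation; the module path, flag names, and file paths are assumptions, not the repository's exact interface:

```shell
# Run a single tracked image edit: one prompt, one conditioning image,
# one reported generation. All flag names and paths are assumptions.
python -m flux.cli_kontext \
  --prompt "make the sky a vivid sunset" \
  --img_cond_path input.png \
  --track_usage
```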
The above reporting logic works similarly for FLUX.1 [dev] and FLUX.1 Tools [dev].
Note that this is only required when using one or more of our open-weight models commercially. More information on commercial licensing can be found at the BFL Helpdesk.
Citation
If you find the provided code or models useful for your research, please consider citing them.