Is this project really Apache-licensed or not?
So a merger had uploaded Chroma AIO weights on HF. It was nice and convenient since it had the realism/accelerator LoRAs merged in. Silveroxides had a conniption because people were downloading the merge instead of doing it manually themselves. I caught a little of the discussion, but the next day everything was purged. I could swear others had GGUFs of it too, and now it's not even findable on HF.
What's up with that? I thought the project was Apache-licensed and could be used and expanded upon freely.
The "doing it manually" thing may have to do with scaled fp8: once you merge things together and save a new model, you don't retain the exact same scaled quality unless you create a new scaled model, and even then you'd probably want to base it on the original fp16 weights. There was also a point about the AIO checkpoint versus UNET-only files that was only skimmed over in the discussion; that's probably what you're referring to, but it really was just skimmed. A rough sketch of the full-precision merge is below.
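For illustration, here's a minimal sketch of what "merging manually" at full precision looks like. The helper name and the alpha/rank scaling convention are just the common LoRA pattern, not anything specific to the purged AIO:

```python
import torch

def merge_lora(base_w: torch.Tensor, lora_down: torch.Tensor,
               lora_up: torch.Tensor, alpha: float, rank: int) -> torch.Tensor:
    """Standard LoRA merge: W' = W + (alpha / rank) * (up @ down).

    Done in float32 on fp16/bf16 base weights; any fp8 re-scaling
    should happen afterwards, on the merged result, which is why
    merging directly into a scaled fp8 checkpoint loses quality.
    """
    delta = (alpha / rank) * (lora_up.float() @ lora_down.float())
    return (base_w.float() + delta).to(base_w.dtype)
```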
As for the license: the 2k-test LoRA the AIO was using (https://huggingface.co/silveroxides/Chroma-LoRAs) is under Creative Commons Attribution-NonCommercial-ShareAlike 4.0, because part of the training data used to train it is under a more restrictive license than what Chroma is under. Chroma itself is 'fully Apache 2.0 licensed'.
edit: added a bit more detail and clarification.
I don't recall the maker using it commercially, so they would only have needed to add some attribution. The trick with scaled fp8 is just to dequantize it to BF16. To me this is a strange bit of friendly fire and infighting.
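If anyone wants to try the dequantize route, here's a rough sketch. It assumes per-tensor scales are stored as `<name>.scale_weight` next to each fp8 weight; that naming varies between checkpoint makers, so inspect your file's keys first, and the filenames are placeholders:

```python
import torch
from safetensors.torch import load_file, save_file

state = load_file("chroma-fp8-scaled.safetensors")  # placeholder filename
out = {}
for name, tensor in state.items():
    if name.endswith(".scale_weight"):
        continue  # consumed together with its matching weight below
    if tensor.dtype == torch.float8_e4m3fn:
        # Undo the per-tensor quantization scale, then cast to BF16
        scale = state.get(name.replace(".weight", ".scale_weight"),
                          torch.ones((), dtype=torch.float32))
        out[name] = (tensor.to(torch.float32) * scale.to(torch.float32)).to(torch.bfloat16)
    else:
        out[name] = tensor  # non-fp8 tensors pass through unchanged

save_file(out, "chroma-bf16.safetensors")
```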