Two Case Studies on Midjourney’s Consistency in Character Imaging
Last Updated on March 14, 2024 by Editorial Team
Author(s): Meng Li
Originally published on Towards AI.
Hey friends, have you ever pondered this question?
When you’re creating a movie or writing a novel, you always want the same character to appear in different scenes and backgrounds, right?
However, a challenge arises: how can you ensure that the character’s appearance remains consistent across all scenes?
The good news is that Midjourney has recently launched a new algorithm.
Whether you’re working with the Midjourney V6 or Niji 6 model, this feature helps ensure the “consistency” of your character images.
Simply put, no matter in which scene your character appears, their appearance will stay consistent.
Sounds quite practical, doesn’t it? But how exactly does one utilize it?
I plan to test this using two character types: a cat and a girl, to evaluate the actual effectiveness of this algorithm.
Midjourney has launched a new parameter called “--cref,” which stands for “character reference.”
You just need to append this parameter, followed by an image URL, after your prompt, and Midjourney will match the character features from that image.
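As a minimal sketch of what this looks like in practice (the image URL below is a placeholder, not one from this article):

```text
/imagine prompt: a girl reading a book in a cozy café --cref https://example.com/my-character.png --v 6
```

The --cref parameter goes at the end of the prompt, after the scene description, and points at an image whose character features you want carried over into the new scene.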
This feature sounds quite convenient, but I’m a bit curious about how accurate it can really be.
Could it end up like before, where the result looks similar but something still feels off?
Additionally, I’ve discovered a new control parameter called “--cw.”
This parameter…
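For context, here is a hedged sketch of how --cw (“character weight”) is typically combined with --cref; the URL and the weight value below are placeholders, not taken from the article:

```text
/imagine prompt: the same girl walking on a beach at sunset --cref https://example.com/my-character.png --cw 50 --v 6
```

Reportedly, --cw accepts values from 0 to 100 (default 100): lower values match mainly the face, while higher values also carry over hair and clothing from the reference image.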