References
Updated: 2026-05
1. About This Page
This guide covers how to use Runway’s References and Motion Brush to convey, through images, videos, and drawings, instructions that are difficult to express in text.
2. Why “References”?
Text prompts have their limitations:
- It’s difficult to accurately convey “this color,” “this texture,” and “this composition” in words
- Even if you describe “a specific person’s face,” you can’t reproduce it
- Describing “this way of moving” in writing becomes overly wordy
In Runway, you can work around these limits by simply dragging and dropping image or video references.
3. Types of References
| Type | Input | Purpose |
|---|---|---|
| Image Reference | One or more images | Reference for composition, color, and atmosphere |
| Style Reference | 1 image | Specifies style (illustration-style, watercolor-style, etc.) |
| Character Reference | Character image | Ensures consistency in the character’s face and clothing |
| Video Reference | Video | Reference for movement |
| First / Last Frame | 2 images | Specify the first and last frames to interpolate the in-between |
The class will focus primarily on Image Reference and Character Reference.
4. How to Use Image Reference
4.1 Basic Workflow
- Drag and drop an image into the reference area on the Generate screen
- That image will serve as a reference for “mood, color, and composition”
- Write instructions focused on “movement” in the prompt
Example: use a photo of a city at dusk as the reference, with this prompt:
A young woman walks down the street in casual clothes. The camera follows her from behind.
The image sets the mood and the prompt drives the action: that is how the roles are divided.
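The workflow above can be sketched as a request payload. This is a rough illustration only; the field names (`reference_image`, `prompt_text`, `strength`) are assumptions for the sketch, not Runway’s documented API:

```python
# Hypothetical sketch of an image-reference generation request.
# Field names are illustrative assumptions, not Runway's actual API.

def build_image_reference_request(reference_image_uri, prompt, strength=50):
    """Pair a mood/composition reference image with a motion-focused prompt."""
    return {
        "reference_image": reference_image_uri,  # sets mood, color, composition
        "prompt_text": prompt,                   # drives the action
        "strength": strength,                    # 0-100: how faithful to the reference
    }

request = build_image_reference_request(
    "dusk_city.jpg",
    "A young woman walks down the street in casual clothes. "
    "The camera follows her from behind.",
)
```

The division of labor is visible in the payload: the image supplies the look, the prompt supplies the movement.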
4.2 Adjusting the Strength of References
Some models (Gen-4.5) include an impact slider in their reference settings. Set it low for a rough guide, or high for a faithful representation.
- 0–30%: Just borrowing the atmosphere
- 50%: Balanced
- 80–100%: Quite faithful, even down to the composition
5. Character Reference
A lifesaver when reusing the same character across multiple shots. It comes in handy for Act-Two and multi-shot production.
5.1 How to Use
- Prepare images that clearly show the character’s face and clothing (ideally facing forward, well-lit, and with a plain background)
- Upload them to the Character Reference area
- Enable this reference when generating each shot
- Use the prompt to specify the character’s movements
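The reuse pattern in the steps above can be sketched as data: one fixed reference, many shots. The field names here are hypothetical, chosen for illustration:

```python
# Hypothetical sketch: reuse a single character reference across shots.
# One front-facing, well-lit image on a plain background works best.
CHARACTER_REF = "hero_front_plain_bg.png"

def build_shot(prompt):
    """Each shot carries the same character reference; only the prompt changes."""
    return {"character_reference": CHARACTER_REF, "prompt_text": prompt}

shots = [build_shot(p) for p in [
    "The character opens a door and steps inside.",
    "The character sits down at a desk and starts typing.",
]]
```

Keeping the reference constant while varying only the prompt is what preserves the character’s face and clothing from shot to shot.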
5.2 Precautions
- Images taken facing forward, against a plain background, and with bright lighting yield the highest success rates
- If a character’s clothing or hairstyle is complex, consistency is easily lost
- Reference images showing a side profile or at an angle result in lower accuracy
This will be covered in more detail in the next section, “Character Consistency.”
6. Video Reference
Use a video as a reference for motion.
Example: Upload a “walking video” you recorded yourself on your smartphone as a reference, and instruct the AI to “generate using this movement.”
Reference: a video of yourself walking
Prompt: A robot walks the same way through a futuristic city.
You can think of it as “copying the movement while changing the appearance.” Some of its features overlap with Aleph, so choose whichever fits your needs (Aleph is covered in the following pages).
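The pairing can be sketched as data. The field names are hypothetical, for illustration only:

```python
# Hypothetical sketch: the video supplies the motion, the prompt the appearance.
request = {
    "video_reference": "me_walking.mp4",  # copied: the movement
    "prompt_text": "A robot walks the same way through a futuristic city.",  # changed: the look
}
```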
7. First / Last Frame
A technique in which the “beginning and end” are fixed as images, and the AI fills in the gaps.
Great for turning storyboards into videos:
- Prepare an image of the starting frame (e.g., the door is closed)
- Prepare an image of the ending frame (e.g., the door is open and a person is visible)
- Runway generates the motion between those two frames
You can turn storyboard keyframes directly into video.
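The technique above can be sketched as a two-keyframe request. The payload shape is an assumption for illustration, not Runway’s documented API:

```python
# Hypothetical sketch of a first/last-frame request: fix both endpoints,
# and the model interpolates the motion in between.

def build_first_last_request(first_frame, last_frame, prompt=""):
    return {
        "keyframes": [
            {"position": "first", "image": first_frame},  # e.g., door closed
            {"position": "last", "image": last_frame},    # e.g., door open, person visible
        ],
        "prompt_text": prompt,
    }

req = build_first_last_request("door_closed.png", "door_open_person.png",
                               "The door slowly swings open.")
```

Two storyboard keyframes in, one interpolated clip out.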
8. Motion Brush
An image-to-video (I2V) feature that lets you use a paint tool to specify which parts of the image should move. Available in the Gen-4 series.
8.1 Basic Workflow
- Upload the input image
- Launch Motion Brush
- Use the brush to paint the “area you want to animate”
- Specify the direction of movement for that area using arrows (or select “still” to keep it static)
- You can assign different movements to different areas
- Generate the video by combining this with a prompt
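Conceptually, the steps above amount to a list of painted areas, each with its own motion. The data shape below is a hypothetical sketch, not Runway’s internal format:

```python
# Hypothetical sketch of Motion Brush settings as data: each painted area
# gets a direction vector, and "still" areas are pinned in place.
brush_layers = [
    {"mask": "clouds_mask.png", "direction": (-1.0, 0.0), "speed": 0.3},  # clouds drift left
    {"mask": "hair_mask.png",   "direction": (0.2, -0.1), "speed": 0.5},  # hair sways
    {"mask": "building_mask.png", "still": True},                         # kept static
]

request = {
    "input_image": "landscape.jpg",
    "motion_brush": brush_layers,
    "prompt_text": "Gentle wind, late afternoon light.",
}
```

Per-area settings are what make cinemagraph-style results possible: everything not painted, or painted as still, stays frozen.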
8.2 Examples
- In landscape photos, only the clouds move to the left
- In portraits, only the hair sways
- In cityscapes, only the cars move toward the foreground
This allows for an effect where “the background remains static while specific elements move.” It is well-suited for cinemagraph-style presentations.
8.3 Motion Brush 2.0
As of 2026, it has evolved into Motion Brush 2.0, allowing each area to move independently. Settings that move one area while keeping another fixed now work reliably.
9. Combining References
You can use multiple references at the same time:
- Image Reference: Cityscape (atmosphere)
- Character Reference: Character image (appearance)
- Motion Brush: Only the clouds move
This enables layered control: atmosphere from image A, character from image B, and movement from the brush.
However, using all of them at once can confuse the AI, so start with just one or two to see how they work.
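The combination above can be sketched as one merged request, with a guard that reflects the advice to start small. All field names are illustrative assumptions:

```python
# Hypothetical sketch: several reference types merged into one request.
request = {
    "image_reference": "cityscape.jpg",      # atmosphere
    "character_reference": "character.png",  # appearance
    "motion_brush": [{"mask": "clouds_mask.png", "direction": (-1.0, 0.0)}],  # movement
}

# Too many simultaneous references can confuse the model,
# so flag requests that stack more than two.
too_many = len(request) > 2
```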
10. Credit Usage in “References”
Using references generally doesn’t cost extra credits; as usual, consumption depends on the generative model and the clip length.
However, Aleph and Act-Two operate on a reference basis, so their consumption rates are different. These will be covered in the following pages.
11. What’s Next
- Shot Planning — From Storyboards to Shot Design
- Character Consistency — Bringing Characters to Life in Act Two
