Generate text using a language model.

prompt
string

Input prompt.

temperature
float[0..1]
Optional

Sampling temperature to use. Higher values make the output more random, lower values make the output more deterministic.

Default: 0.4
max_tokens
integer
Optional

Maximum number of tokens to generate.

node
string
Optional

Selected node.

Options: Mistral7BInstruct, Mixtral8x7BInstruct, Llama3Instruct8B, Llama3Instruct70B
Default: Mistral7BInstruct
TypeScript

const node = new GenerateText({
  prompt: "Who is Don Quixote?",
  temperature: 0.4,
  max_tokens: 800,
});

Output

{
  "text": "Don Quixote is a fictional character in the novel of the same name by Miguel de Cervantes."
}
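Conceptually, temperature rescales the model's token logits before sampling: dividing by a value below 1 sharpens the distribution toward the top token, while values near 1 leave it flatter. A minimal sketch of the math (illustrative only, not Substrate's implementation):

```typescript
// Temperature-scaled softmax: divides logits by T before normalizing.
// Lower T concentrates probability mass on the highest logit (more
// deterministic); higher T spreads it out (more random).
function softmaxWithTemperature(logits: number[], temperature: number): number[] {
  const scaled = logits.map((l) => l / temperature);
  const max = Math.max(...scaled); // subtract max for numerical stability
  const exps = scaled.map((s) => Math.exp(s - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}
```

With logits [1, 2, 3], the top token's probability rises from roughly 0.67 at temperature 1.0 to over 0.9 at the default 0.4.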

Generate multiple text choices using a language model.

prompt
string

Input prompt.

num_choices
integer[1..8]

Number of choices to generate.

Default: 1
temperature
float[0..1]
Optional

Sampling temperature to use. Higher values make the output more random, lower values make the output more deterministic.

Default: 0.4
max_tokens
integer
Optional

Maximum number of tokens to generate.

node
string
Optional

Selected node.

Options: Mistral7BInstruct, Mixtral8x7BInstruct, Llama3Instruct8B, Llama3Instruct70B
Default: Mistral7BInstruct
TypeScript

const node = new MultiGenerateText({
  prompt: "Who is Don Quixote?",
  num_choices: 2,
  max_tokens: 800,
});

Output

{
  "choices": [
    {
      "text": "Don Quixote is a fictional character and the protagonist of the novel Don Quixote by Miguel..."
    },
    {
      "text": "Don Quixote is a fictional character created by the Spanish author Miguel de Cervantes..."
    }
  ]
}

Generate text for multiple prompts in batch using a language model.

prompts
array[string]

Batch input prompts.

temperature
float[0..1]
Optional

Sampling temperature to use. Higher values make the output more random, lower values make the output more deterministic.

Default: 0.4
max_tokens
integer
Optional

Maximum number of tokens to generate.

TypeScript

const node = new BatchGenerateText({
  prompts: [
    "Who is Don Quixote?",
    "Who is Sancho Panza?",
  ],
  max_tokens: 800,
});

Output

{
  "outputs": [
    {
      "text": "Don Quixote is a fictional character and the protagonist of the novel Don Quixote by Miguel..."
    },
    {
      "text": "Don Quixote is a fictional character created by the Spanish author Miguel de Cervantes..."
    }
  ]
}
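The outputs array appears to line up with the prompts array by index (an assumption based on the example, worth verifying against the API reference), so pairing each prompt with its completion is a simple zip. A small helper sketch:

```typescript
// Pair each prompt with its generated text, assuming the outputs array
// preserves prompt order (assumption; confirm with the API docs).
function zipOutputs(
  prompts: string[],
  outputs: { text: string }[],
): { prompt: string; text: string }[] {
  return prompts.map((prompt, i) => ({ prompt, text: outputs[i]?.text ?? "" }));
}
```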

Generate JSON using a language model.

prompt
string

Input prompt.

json_schema
object

JSON schema to guide json_object response.

temperature
float[0..1]
Optional

Sampling temperature to use. Higher values make the output more random, lower values make the output more deterministic.

Default: 0.4
max_tokens
integer
Optional

Maximum number of tokens to generate.

node
string
Optional

Selected node.

Options: Mistral7BInstruct, Mixtral8x7BInstruct
Default: Mistral7BInstruct
TypeScript

const node = new GenerateJSON({
  prompt: "Who wrote Don Quixote?",
  json_schema: {
    type: "object",
    properties: {
      name: {
        type: "string",
        description: "The name of the author.",
      },
      bio: {
        type: "string",
        description: "Concise biography of the author.",
      },
    },
  },
  temperature: 0.4,
  max_tokens: 800,
});

Output

{
  "json_object": {}
}
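Schema-guided generation steers the model toward the schema, but the returned json_object is still worth validating client-side. A minimal shallow check for the example schema above (a hand-rolled sketch, not a full JSON Schema validator; use a real validator such as Ajv in practice):

```typescript
type PropSchema = { type: string; description?: string };
type ObjectSchema = { type: string; properties?: Record<string, PropSchema> };

// Shallow check: the value is an object, and every declared property that is
// present has the declared primitive type. Nested schemas, arrays, and
// "required" are intentionally out of scope for this sketch.
function matchesSchema(obj: Record<string, unknown>, schema: ObjectSchema): boolean {
  if (schema.type !== "object" || typeof obj !== "object" || obj === null) return false;
  for (const [key, prop] of Object.entries(schema.properties ?? {})) {
    if (key in obj && typeof obj[key] !== prop.type) return false;
  }
  return true;
}
```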

Generate multiple JSON choices using a language model.

prompt
string

Input prompt.

json_schema
object

JSON schema to guide json_object response.

num_choices
integer[1..8]

Number of choices to generate.

Default: 2
temperature
float[0..1]
Optional

Sampling temperature to use. Higher values make the output more random, lower values make the output more deterministic.

Default: 0.4
max_tokens
integer
Optional

Maximum number of tokens to generate.

node
string
Optional

Selected node.

Options: Mistral7BInstruct, Mixtral8x7BInstruct
Default: Mistral7BInstruct
TypeScript

const node = new MultiGenerateJSON({
  prompt: "Who wrote Don Quixote?",
  json_schema: {
    type: "object",
    properties: {
      name: {
        type: "string",
        description: "The name of the author.",
      },
      bio: {
        type: "string",
        description: "Concise biography of the author.",
      },
    },
  },
  num_choices: 2,
  temperature: 0.4,
  max_tokens: 800,
});

Output

{
  "choices": [
    {
      "json_object": {}
    },
    {
      "json_object": {}
    }
  ]
}

Generate JSON for multiple prompts in batch using a language model.

prompts
array[string]

Batch input prompts.

json_schema
object

JSON schema to guide json_object response.

temperature
float[0..1]
Optional

Sampling temperature to use. Higher values make the output more random, lower values make the output more deterministic.

Default: 0.4
max_tokens
integer
Optional

Maximum number of tokens to generate.

TypeScript

const node = new BatchGenerateJSON({
  prompts: [
    "Who is Don Quixote?",
    "Who is Sancho Panza?",
  ],
  max_tokens: 800,
  json_schema: {
    type: "object",
    properties: {
      name: {
        type: "string",
        description: "The name of the character.",
      },
      bio: {
        type: "string",
        description: "Concise biography of the character.",
      },
    },
  },
});

Output

{
  "outputs": [
    {
      "json_object": {}
    },
    {
      "json_object": {}
    }
  ]
}

Generate text with image input.

prompt
string

Text prompt.

image_uris
array[string]

Image prompts.

max_tokens
integer
Optional

Maximum number of tokens to generate.

Default: 800
TypeScript

const node = new GenerateTextVision({
  prompt: "what are these paintings of and who made them?",
  image_uris: [
    "https://media.substrate.run/docs-fuji-red.jpg",
    "https://media.substrate.run/docs-fuji-blue.jpg",
  ],
});

Output

{
  "text": "The artist who created these paintings is Hokusai Katsushika, a renowned Japanese artist known for his woodblock prints and paintings."
}

Generate text using Mistral 7B Instruct.

prompt
string

Input prompt.

num_choices
integer[1..8]
Optional

Number of choices to generate.

Default: 1
json_schema
object
Optional

JSON schema to guide response.

temperature
float[0..1]
Optional

Sampling temperature to use. Higher values make the output more random, lower values make the output more deterministic.

max_tokens
integer
Optional

Maximum number of tokens to generate.

TypeScript

const node = new Mistral7BInstruct({
  prompt: "Who is Don Quixote?",
  num_choices: 2,
  temperature: 0.4,
  max_tokens: 800,
});

Output

{
  "choices": [
    {
      "text": "Don Quixote is a fictional character and the protagonist of the novel Don Quixote by Miguel..."
    },
    {
      "text": "Don Quixote is a fictional character created by the Spanish author Miguel de Cervantes..."
    }
  ]
}

Generate text using instruct-tuned Mixtral 8x7B.

prompt
string

Input prompt.

num_choices
integer[1..8]
Optional

Number of choices to generate.

Default: 1
json_schema
object
Optional

JSON schema to guide response.

temperature
float[0..1]
Optional

Sampling temperature to use. Higher values make the output more random, lower values make the output more deterministic.

max_tokens
integer
Optional

Maximum number of tokens to generate.

TypeScript

const node = new Mixtral8x7BInstruct({
  prompt: "Who is Don Quixote?",
  num_choices: 2,
  temperature: 0.4,
  max_tokens: 800,
});

Output

{
  "choices": [
    {
      "text": "Don Quixote is a fictional character and the protagonist of the novel Don Quixote by Miguel..."
    },
    {
      "text": "Don Quixote is a fictional character created by the Spanish author Miguel de Cervantes..."
    }
  ]
}

Generate text using instruct-tuned Llama 3 8B.

prompt
string

Input prompt.

num_choices
integer[1..8]
Optional

Number of choices to generate.

Default: 1
temperature
float[0..1]
Optional

Sampling temperature to use. Higher values make the output more random, lower values make the output more deterministic.

max_tokens
integer
Optional

Maximum number of tokens to generate.

TypeScript

const node = new Llama3Instruct8B({
  prompt: "Who is Don Quixote?",
  num_choices: 2,
  temperature: 0.4,
  max_tokens: 800,
});

Output

{
  "choices": [
    {
      "text": "Don Quixote is a fictional character and the protagonist of the novel Don Quixote by Miguel..."
    },
    {
      "text": "Don Quixote is a fictional character created by the Spanish author Miguel de Cervantes..."
    }
  ]
}

Generate text using instruct-tuned Llama 3 70B.

prompt
string

Input prompt.

num_choices
integer[1..8]
Optional

Number of choices to generate.

Default: 1
temperature
float[0..1]
Optional

Sampling temperature to use. Higher values make the output more random, lower values make the output more deterministic.

max_tokens
integer
Optional

Maximum number of tokens to generate.

TypeScript

const node = new Llama3Instruct70B({
  prompt: "Who is Don Quixote?",
  num_choices: 2,
  temperature: 0.4,
  max_tokens: 800,
});

Output

{
  "choices": [
    {
      "text": "Don Quixote is a fictional character and the protagonist of the novel Don Quixote by Miguel..."
    },
    {
      "text": "Don Quixote is a fictional character created by the Spanish author Miguel de Cervantes..."
    }
  ]
}

Generate text with image input using FireLLaVA 13B.

prompt
string

Text prompt.

image_uris
array[string]

Image prompts.

max_tokens
integer
Optional

Maximum number of tokens to generate.

Default: 800
TypeScript

const node = new Firellava13B({
  prompt: "what are these paintings of and who made them?",
  image_uris: [
    "https://media.substrate.run/docs-fuji-red.jpg",
    "https://media.substrate.run/docs-fuji-blue.jpg",
  ],
});

Output

{
  "text": "The artist who created these paintings is Hokusai Katsushika, a renowned Japanese artist known for his woodblock prints and paintings."
}

Generate an image.

prompt
string

Text prompt.

store
string
Optional

Use "hosted" to return an image URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the image data will be returned as a base64-encoded string.

TypeScript

const node = new GenerateImage({
  prompt: "hokusai futuristic supercell spiral cloud with glowing core over turbulent ocean",
  store: "hosted",
});

Output

{
  "image_uri": "https://assets.substrate.run/84848484.jpg"
}

Generate multiple images.

prompt
string

Text prompt.

num_images
integer[1..8]

Number of images to generate.

Default: 2
store
string
Optional

Use "hosted" to return an image URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the image data will be returned as a base64-encoded string.

TypeScript

const node = new MultiGenerateImage({
  prompt: "hokusai futuristic supercell spiral cloud with glowing core over turbulent ocean",
  num_images: 2,
  store: "hosted",
});

Output

{
  "outputs": [
    {
      "image_uri": "https://assets.substrate.run/84848484.jpg"
    },
    {
      "image_uri": "https://assets.substrate.run/48484848.jpg"
    }
  ]
}

Edit an image using image generation.

image_uri
string

Original image.

prompt
string

Text prompt.

mask_image_uri
string
Optional

Mask image that controls which pixels are inpainted. If unset, the entire image is edited (image-to-image).

store
string
Optional

Use "hosted" to return an image URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the image data will be returned as a base64-encoded string.

TypeScript

const node = new GenerativeEditImage({
  image_uri: "https://media.substrate.run/docs-klimt-park.jpg",
  mask_image_uri: "https://media.substrate.run/spiral-logo.jpeg",
  prompt: "large tropical colorful bright anime birds in a dark jungle full of vines, high resolution",
  store: "hosted",
});

Output

{
  "image_uri": "https://assets.substrate.run/84848484.jpg"
}

Edit multiple images using image generation.

image_uri
string

Original image.

prompt
string

Text prompt.

mask_image_uri
string
Optional

Mask image that controls which pixels are edited (inpainting). If unset, the entire image is edited (image-to-image).

num_images
integer[1..8]

Number of images to generate.

Default: 2
store
string
Optional

Use "hosted" to return an image URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the image data will be returned as a base64-encoded string.

TypeScript

const node = new MultiGenerativeEditImage({
  image_uri: "https://media.substrate.run/docs-klimt-park.jpg",
  mask_image_uri: "https://media.substrate.run/spiral-logo.jpeg",
  prompt: "large tropical colorful bright anime birds in a dark jungle full of vines, high resolution",
  num_images: 2,
  store: "hosted",
});

Output

{
  "outputs": [
    {
      "image_uri": "https://assets.substrate.run/84848484.jpg"
    },
    {
      "image_uri": "https://assets.substrate.run/48484848.jpg"
    }
  ]
}

Generate an image using Stable Diffusion XL.

prompt
string

Text prompt.

negative_prompt
string
Optional

Negative input prompt.

steps
integer[0..150]
Optional

Number of diffusion steps.

Default: 30
num_images
integer[1..8]

Number of images to generate.

Default: 1
store
string
Optional

Use "hosted" to return an image URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the image data will be returned as a base64-encoded string.

height
integer[256..1536]
Optional

Height of output image, in pixels.

Default: 1024
width
integer[256..1536]
Optional

Width of output image, in pixels.

Default: 1024
seeds
array[integer]
Optional

Seeds for deterministic generation. Default is a random seed.

guidance_scale
float[0..30]
Optional

Higher values adhere to the text prompt more strongly, typically at the expense of image quality.

Default: 7
TypeScript

const node = new StableDiffusionXL({
  prompt: "hokusai futuristic supercell spiral cloud with glowing core over turbulent ocean",
  negative_prompt: "night, moon",
  store: "hosted",
  guidance_scale: 7,
  num_images: 2,
  seeds: [
    3306990332671669418,
    13641924104177017164,
  ],
});

Output

{
  "outputs": [
    {
      "image_uri": "https://assets.substrate.run/84848484.jpg",
      "seed": 3306990332671669418
    },
    {
      "image_uri": "https://assets.substrate.run/48484848.jpg",
      "seed": 13641924104177017164
    }
  ]
}

Generate an image using Stable Diffusion XL Lightning.

prompt
string

Text prompt.

negative_prompt
string
Optional

Negative input prompt.

num_images
integer[1..8]
Optional

Number of images to generate.

Default: 1
store
string
Optional

Use "hosted" to return an image URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the image data will be returned as a base64-encoded string.

height
integer[256..1536]
Optional

Height of output image, in pixels.

Default: 1024
width
integer[256..1536]
Optional

Width of output image, in pixels.

Default: 1024
seeds
array[integer]
Optional

Seeds for deterministic generation. Default is a random seed.

TypeScript

const node = new StableDiffusionXLLightning({
  prompt: "hokusai futuristic supercell spiral cloud with glowing core over turbulent ocean",
  negative_prompt: "night, moon",
  num_images: 2,
  seeds: [
    3306990332671669418,
    13641924104177017164,
  ],
  store: "hosted",
});

Output

{
  "outputs": [
    {
      "image_uri": "https://assets.substrate.run/84848484.jpg",
      "seed": 3306990332671669418
    },
    {
      "image_uri": "https://assets.substrate.run/48484848.jpg",
      "seed": 13641924104177017164
    }
  ]
}

Edit an image using Stable Diffusion XL. Supports inpainting (edit part of the image with a mask) and image-to-image (edit the full image).

image_uri
string

Original image.

prompt
string

Text prompt.

mask_image_uri
string
Optional

Mask image that controls which pixels are edited (inpainting). If unset, the entire image is edited (image-to-image).

num_images
integer[1..8]

Number of images to generate.

Default: 1
output_resolution
integer
Optional

Resolution of the output image, in pixels.

Default: 1024
negative_prompt
string
Optional

Negative input prompt.

store
string
Optional

Use "hosted" to return an image URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the image data will be returned as a base64-encoded string.

strength
float[0..1]
Optional

Controls the strength of the generation process.

Default: 0.8
seeds
array[integer]
Optional

Random noise seeds. Default is random seeds for each generation.

TypeScript

const node = new StableDiffusionXLInpaint({
  image_uri: "https://media.substrate.run/docs-klimt-park.jpg",
  mask_image_uri: "https://media.substrate.run/spiral-logo.jpeg",
  prompt: "large tropical colorful bright birds in a jungle, high resolution oil painting",
  negative_prompt: "dark, cartoon, anime",
  strength: 0.8,
  num_images: 2,
  store: "hosted",
  seeds: [
    16072680593433107326,
    17203982922585031095,
  ],
});

Output

{
  "outputs": [
    {
      "image_uri": "https://assets.substrate.run/84848484.jpg",
      "seed": 16072680593433107326
    },
    {
      "image_uri": "https://assets.substrate.run/48484848.jpg",
      "seed": 17203982922585031095
    }
  ]
}

Generate an image with generation structured by an input image, using Stable Diffusion XL with ControlNet.

image_uri
string

Input image.

control_method
string

Strategy to control generation using the input image.

Options: edge, depth, illusion
prompt
string

Text prompt.

num_images
integer[1..8]

Number of images to generate.

Default: 1
output_resolution
integer
Optional

Resolution of the output image, in pixels.

Default: 1024
negative_prompt
string
Optional

Negative input prompt.

store
string
Optional

Use "hosted" to return an image URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the image data will be returned as a base64-encoded string.

conditioning_scale
float[0..1]
Optional

Controls the influence of the input image on the generated output.

Default: 0.5
seeds
array[integer]
Optional

Random noise seeds. Default is random seeds for each generation.

TypeScript

const node = new StableDiffusionXLControlNet({
  image_uri: "https://media.substrate.run/spiral-logo.jpeg",
  prompt: "the futuristic solarpunk city of atlantis at sunset, cinematic bokeh HD",
  control_method: "illusion",
  conditioning_scale: 0.5,
  store: "hosted",
  num_images: 2,
  seeds: [
    16072680593433107326,
    17203982922585031095,
  ],
});

Output

{
  "outputs": [
    {
      "image_uri": "https://assets.substrate.run/84848484.jpg",
      "seed": 16072680593433107326
    },
    {
      "image_uri": "https://assets.substrate.run/48484848.jpg",
      "seed": 17203982922585031095
    }
  ]
}

Generate an image with an image prompt, using Stable Diffusion XL with IP-Adapter.

prompt
string

Text prompt.

image_prompt_uri
string

Image prompt.

num_images
integer[1..8]

Number of images to generate.

Default: 1
ip_adapter_scale
float[0..1]
Optional

Controls the influence of the image prompt on the generated output.

Default: 0.5
negative_prompt
string
Optional

Negative input prompt.

store
string
Optional

Use "hosted" to return an image URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the image data will be returned as a base64-encoded string.

width
integer[640..1536]
Optional

Width of output image, in pixels.

Default: 1024
height
integer[640..1536]
Optional

Height of output image, in pixels.

Default: 1024
seeds
array[integer]
Optional

Random noise seeds. Default is random seeds for each generation.

TypeScript

const node = new StableDiffusionXLIPAdapter({
  prompt: "woodblock wave at sunset",
  negative_prompt: "low quality, low resolution",
  image_prompt_uri: "https://guides.substrate.run/hokusai.jpeg",
  store: "hosted",
  num_images: 2,
  ip_adapter_scale: 0.9,
  seeds: [
    6565750906821527600,
    9762512681041688800,
  ],
});

Output

{
  "outputs": [
    {
      "image_uri": "https://assets.substrate.run/84848484.jpg",
      "seed": 6565750906821527600
    },
    {
      "image_uri": "https://assets.substrate.run/48484848.jpg",
      "seed": 9762512681041688800
    }
  ]
}

Generate embedding for a text document.

text
string

Text to embed.

collection_name
string
Optional

Vector store name.

metadata
object
Optional

Metadata that can be used to query the vector store. Ignored if collection_name is unset.

embedded_metadata_keys
array[string]
Optional

Choose keys from metadata to embed with text.

doc_id
string
Optional

Vector store document ID. Ignored if collection_name is unset.

model
string
Optional

Selected embedding model.

Options: jina-v2, clip
Default: jina-v2
TypeScript

const node = new EmbedText({
  text: "Argon is the third most abundant gas in Earth's atmosphere, at 0.934% (9340 ppmv). It is more than twice as abundant as water vapor.",
  model: "jina-v2",
  collection_name: "smoke_tests",
  metadata: {
    group: "18",
  },
  embedded_metadata_keys: [
    "group",
  ],
});

Output

{
  "embedding": {
    "vector": [
      -0.035030052065849304,
      -0.04128379374742508,
      0.05782046541571617
    ],
    "doc_id": "c9de81fb98804ce0afb2b8ac17c0799b",
    "metadata": {
      "group": "18",
      "doc_id": "c9de81fb98804ce0afb2b8ac17c0799b",
      "doc": "group: 18\n\nArgon is the third most abundant gas in Earth's atmosphere, at 0.934% (9340 ppmv). It is more than twice as abundant as water vapor."
    }
  }
}
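As the doc field in the output shows, the metadata keys listed in embedded_metadata_keys are prepended to the text before embedding as "key: value" lines followed by a blank line. A sketch of that composition (inferred from the example output, not the service's actual code):

```typescript
// Rebuild the embedded document string in the "doc" format seen in the
// example output: "key: value" header lines, a blank line, then the text.
function composeEmbeddedDoc(
  text: string,
  metadata: Record<string, string>,
  embeddedKeys: string[],
): string {
  const header = embeddedKeys
    .filter((k) => k in metadata)
    .map((k) => `${k}: ${metadata[k]}`)
    .join("\n");
  return header ? `${header}\n\n${text}` : text;
}
```

Keys embedded this way influence similarity search, so only include metadata that should affect retrieval.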

Generate embeddings for multiple text documents.

items
array[EmbedTextItem]

Items to embed.

text
string

Text to embed.

metadata
object
Optional

Metadata that can be used to query the vector store. Ignored if collection_name is unset.

doc_id
string
Optional

Vector store document ID. Ignored if collection_name is unset.

collection_name
string
Optional

Vector store name.

embedded_metadata_keys
array[string]
Optional

Choose keys from metadata to embed with text.

model
string
Optional

Selected embedding model.

Options: jina-v2, clip
Default: jina-v2
TypeScript

const node = new MultiEmbedText({
  model: "jina-v2",
  items: [
    {
      text: "Osmium is the densest naturally occurring element. When experimentally measured using X-ray crystallography, it has a density of 22.59 g/cm3. Manufacturers use its alloys with platinum, iridium, and other platinum-group metals to make fountain pen nib tipping, electrical contacts, and in other applications that require extreme durability and hardness.",
      metadata: {
        group: "8",
      },
    },
    {
      text: "Despite its abundant presence in the universe and Solar System—ranking fifth in cosmic abundance following hydrogen, helium, oxygen, and carbon—neon is comparatively scarce on Earth.",
      metadata: {
        group: "18",
      },
    },
  ],
  collection_name: "smoke_tests",
  embedded_metadata_keys: [
    "group",
  ],
});

Output

{
  "embeddings": [
    {
      "vector": [
        -0.035030052065849304,
        -0.04128379374742508,
        0.05782046541571617
      ],
      "doc_id": "c9de81fb98804ce0afb2b8ac17c0799b",
      "metadata": {
        "group": "8",
        "doc_id": "c9de81fb98804ce0afb2b8ac17c0799b",
        "doc": "group: 8\n\nOsmium is the densest naturally occurring element. When experimentally measured using X-ray crystallography, it has a density of 22.59 g/cm3. Manufacturers use its alloys with platinum, iridium, and other platinum-group metals to make fountain pen nib tipping, electrical contacts, and in other applications that require extreme durability and hardness."
      }
    },
    {
      "vector": [
        0.0003024724137503654,
        -0.025219274684786797,
        -0.009984994307160378
      ],
      "doc_id": "c4464f69c93946a896925589681d38b4",
      "metadata": {
        "group": "18",
        "doc_id": "c4464f69c93946a896925589681d38b4",
        "doc": "group: 18\n\nDespite its abundant presence in the universe and Solar System\u2014ranking fifth in cosmic abundance following hydrogen, helium, oxygen, and carbon\u2014neon is comparatively scarce on Earth."
      }
    }
  ]
}

Generate embedding for an image.

image_uri
string

Image to embed.

collection_name
string
Optional

Vector store name.

doc_id
string
Optional

Vector store document ID. Ignored if collection_name is unset.

model
string
Optional

Selected embedding model.

Default: clip
TypeScript

const node = new EmbedImage({
  image_uri: "https://media.substrate.run/docs-fuji-red.jpg",
  collection_name: "smoke_tests",
});

Output

{
  "embedding": {
    "vector": [
      0.0003024724137503654,
      -0.025219274684786797,
      -0.009984994307160378
    ],
    "doc_id": "c4464f69c93946a896925589681d38b4"
  }
}

Generate embeddings for multiple images.

items
array[EmbedImageItem]

Items to embed.

image_uri
string

Image to embed.

doc_id
string
Optional

Vector store document ID. Ignored if collection_name is unset.

collection_name
string
Optional

Vector store name.

model
string
Optional

Selected embedding model.

Default: clip
TypeScript

const node = new MultiEmbedImage({
  items: [
    {
      image_uri: "https://media.substrate.run/docs-fuji-red.jpg",
    },
    {
      image_uri: "https://media.substrate.run/docs-fuji-blue.jpg",
    },
  ],
  collection_name: "smoke_tests",
});

Output

{
  "embeddings": [
    {
      "vector": [
        -0.035030052065849304,
        -0.04128379374742508,
        0.05782046541571617
      ],
      "doc_id": "c9de81fb98804ce0afb2b8ac17c0799b"
    },
    {
      "vector": [
        0.0003024724137503654,
        -0.025219274684786797,
        -0.009984994307160378
      ],
      "doc_id": "c4464f69c93946a896925589681d38b4"
    }
  ]
}

Generate embeddings for multiple text documents using Jina Embeddings 2.

items
array[EmbedTextItem]

Items to embed.

text
string

Text to embed.

metadata
object
Optional

Metadata that can be used to query the vector store. Ignored if collection_name is unset.

doc_id
string
Optional

Vector store document ID. Ignored if collection_name is unset.

collection_name
string
Optional

Vector store name.

embedded_metadata_keys
array[string]
Optional

Choose keys from metadata to embed with text.

TypeScript

const node = new JinaV2({
  items: [
    {
      text: "Hassium is a superheavy element; it has been produced in a laboratory only in very small quantities by fusing heavy nuclei with lighter ones. Natural occurrences of the element have been hypothesised but never found.",
      metadata: {
        group: "8",
      },
    },
    {
      text: "Xenon is also used to search for hypothetical weakly interacting massive particles and as a propellant for ion thrusters in spacecraft.",
      metadata: {
        group: "18",
      },
    },
  ],
  collection_name: "smoke_tests",
  embedded_metadata_keys: [
    "group",
  ],
});

Output

{
  "embeddings": [
    {
      "vector": [
        -0.035030052065849304,
        -0.04128379374742508,
        0.05782046541571617
      ],
      "doc_id": "c9de81fb98804ce0afb2b8ac17c0799b",
      "metadata": {
        "group": "8",
        "doc_id": "c9de81fb98804ce0afb2b8ac17c0799b",
        "doc": "group: 8\n\nHassium is a superheavy element; it has been produced in a laboratory only in very small quantities by fusing heavy nuclei with lighter ones. Natural occurrences of the element have been hypothesised but never found."
      }
    },
    {
      "vector": [
        0.0003024724137503654,
        -0.025219274684786797,
        -0.009984994307160378
      ],
      "doc_id": "c4464f69c93946a896925589681d38b4",
      "metadata": {
        "group": "18",
        "doc_id": "c4464f69c93946a896925589681d38b4",
        "doc": "group: 18\n\nXenon is also used to search for hypothetical weakly interacting massive particles and as a propellant for ion thrusters in spacecraft."
      }
    }
  ]
}

Generate embeddings for text or images using CLIP.

items
array[EmbedTextOrImageItem]

Items to embed.

image_uri
string
Optional

Image to embed.

text
string
Optional

Text to embed.

metadata
object
Optional

Metadata that can be used to query the vector store. Ignored if collection_name is unset.

doc_id
string
Optional

Vector store document ID. Ignored if collection_name is unset.

collection_name
string
Optional

Vector store name.

embedded_metadata_keys
array[string]
Optional

Choose keys from metadata to embed with text. Only applies to text items.

TypeScript

const node = new CLIP({
  items: [
    {
      image_uri: "https://media.substrate.run/docs-fuji-red.jpg",
    },
    {
      image_uri: "https://media.substrate.run/docs-fuji-blue.jpg",
    },
  ],
  collection_name: "smoke_tests",
});

Output

{
  "embeddings": [
    {
      "vector": [
        -0.035030052065849304,
        -0.04128379374742508,
        0.05782046541571617
      ],
      "doc_id": "c9de81fb98804ce0afb2b8ac17c0799b"
    },
    {
      "vector": [
        0.0003024724137503654,
        -0.025219274684786797,
        -0.009984994307160378
      ],
      "doc_id": "c4464f69c93946a896925589681d38b4"
    }
  ]
}

Create a vector store for storing and querying embeddings.

collection_name
string

Vector store name.

model
string

Selected embedding model.

Options: jina-v2, clip
m
integer[1..64]
Optional

The max number of connections per layer for the index.

Default: 16
ef_construction
integer[1..128]
Optional

The size of the dynamic candidate list for constructing the index graph.

Default: 64
metric
string
Optional

The distance metric to construct the index with.

Options: cosine, l2, inner
Default: inner
TypeScript

const node = new CreateVectorStore({
  collection_name: "smoke_tests",
  model: "jina-v2",
});

Output

{
  "collection_name": "smoke_tests",
  "model": "jina-v2",
  "m": 16,
  "ef_construction": 64,
  "metric": "inner"
}
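The metric choice matters most for unnormalized vectors: on unit-length embeddings, inner product and cosine similarity coincide, which is why inner is a reasonable default for embeddings that are normalized at index time. A quick illustration (generic vector math, not tied to the index internals):

```typescript
// Inner (dot) product of two equal-length vectors.
function dot(a: number[], b: number[]): number {
  return a.reduce((s, v, i) => s + v * b[i], 0);
}

// Scale a vector to unit length.
function normalize(v: number[]): number[] {
  const n = Math.sqrt(dot(v, v));
  return v.map((x) => x / n);
}

// Cosine similarity is just the inner product of the normalized vectors,
// so the two metrics rank results identically on unit-length embeddings.
function cosine(a: number[], b: number[]): number {
  return dot(normalize(a), normalize(b));
}
```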

List all vector stores.

TypeScript

const node = new ListVectorStores({});

Output

{
  "items": [
    {
      "collection_name": "comments",
      "model": "jina-v2",
      "m": 16,
      "ef_construction": 64,
      "metric": "inner"
    },
    {
      "collection_name": "images",
      "model": "jina-v2",
      "m": 16,
      "ef_construction": 64,
      "metric": "inner"
    }
  ]
}

Delete a vector store.

collection_name
string

Vector store name.

model
string

Selected embedding model.

Options: jina-v2, clip
TypeScript

const node = new DeleteVectorStore({
  collection_name: "fake_store",
  model: "jina-v2",
});

Output

{
  "collection_name": "fake_store",
  "model": "jina-v2"
}

Query a vector store for similar vectors.

collection_name
string

Vector store to query against.

model
string

Selected embedding model.

Options: jina-v2, clip
query_strings
array[string]
Optional

Texts to embed and use for the query.

query_image_uris
array[string]
Optional

Image URIs to embed and use for the query.

query_vectors
array[array]
Optional

Vectors to use for the query.

query_ids
array[string]
Optional

Document IDs to use for the query.

top_k
integer[1..1000]
Optional

Number of results to return.

Default: 10
ef_search
integer[1..1000]
Optional

The size of the dynamic candidate list for searching the index graph.

Default: 40
include_values
boolean
Optional

Include the values of the vectors in the response.

Default: false
include_metadata
boolean
Optional

Include the metadata of the vectors in the response.

Default: false
filters
object
Optional

Filter metadata by key-value pairs.

TypeScript

const node = new QueryVectorStore({
  collection_name: "smoke_tests",
  model: "jina-v2",
  query_strings: [
    "gas",
    "metal",
  ],
  top_k: 1,
  include_metadata: true,
});

Output

{
  "results": [
    [
      {
        "id": "483e75021c9d4ad69c3d78ace76da2ea",
        "distance": -0.78324556350708,
        "metadata": {
          "doc": "group: 18\n\nArgon is the third most abundant gas in Earth's atmosphere, at 0.934% (9340 ppmv). It is more than twice as abundant as water vapor.",
          "group": "18",
          "doc_id": "483e75021c9d4ad69c3d78ace76da2ea"
        }
      }
    ],
    [
      {
        "id": "dd8f3774e05d42caa53cfbaa7389c08f",
        "distance": -0.74278724193573,
        "metadata": {
          "doc": "group: 8\n\nOsmium is the densest naturally occurring element. When experimentally measured using X-ray crystallography, it has a density of 22.59 g/cm3. Manufacturers use its alloys with platinum, iridium, and other platinum-group metals to make fountain pen nib tipping, electrical contacts, and in other applications that require extreme durability and hardness.",
          "group": "8",
          "doc_id": "dd8f3774e05d42caa53cfbaa7389c08f"
        }
      }
    ]
  ],
  "collection_name": "smoke_tests",
  "model": "jina-v2",
  "metric": "inner"
}
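Note the negative distance values in the example: with the inner metric, similarity appears to be reported as negative inner product so that a smaller distance still means a closer match (an inference from the sample output, worth confirming). A brute-force sketch of the same ranking, one query at a time:

```typescript
type Doc = { id: string; vector: number[] };

// Rank documents by negative inner product against a query vector
// (smaller distance = more similar), mirroring the response ordering.
function topK(
  query: number[],
  docs: Doc[],
  k: number,
): { id: string; distance: number }[] {
  return docs
    .map((d) => ({
      id: d.id,
      distance: -d.vector.reduce((s, v, i) => s + v * query[i], 0),
    }))
    .sort((a, b) => a.distance - b.distance)
    .slice(0, k);
}
```

The HNSW index trades this exhaustive scan for an approximate graph search tuned by ef_search: larger values search more of the graph for better recall at higher latency.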

Fetch vectors from a vector store.

collection_name
string

Vector store name.

model
string

Selected embedding model.

Options: jina-v2, clip
ids
array[string]

Document IDs to retrieve.

TypeScript

const node = new FetchVectors({
  collection_name: "smoke_tests",
  model: "jina-v2",
  ids: [
    "dd8f3774e05d42caa53cfbaa7389c08f",
  ],
});

Output

{
  "vectors": [
    {
      "id": "dd8f3774e05d42caa53cfbaa7389c08f",
      "vector": [
        0.036658343,
        -0.0066040196,
        0.028221145
      ],
      "metadata": {
        "doc": "group: 8\n\nOsmium is the densest naturally occurring element. When experimentally measured using X-ray crystallography, it has a density of 22.59 g/cm3. Manufacturers use its alloys with platinum, iridium, and other platinum-group metals to make fountain pen nib tipping, electrical contacts, and in other applications that require extreme durability and hardness.",
        "group": "8",
        "doc_id": "dd8f3774e05d42caa53cfbaa7389c08f"
      }
    }
  ]
}

Update vectors in a vector store.

collection_name
string

Vector store name.

model
string

Selected embedding model.

Options: jina-v2, clip
vectors
array[UpdateVectorParams]

Vectors to upsert.

id
string

Document ID.

vector
array[number]
Optional

Embedding vector.

metadata
object
Optional

Document metadata.

TypeScript

const node = new UpdateVectors({
  collection_name: "smoke_tests",
  model: "jina-v2",
  vectors: [
    {
      id: "dd8f3774e05d42caa53cfbaa7389c08f",
      metadata: {
        appearance: "silvery, blue cast",
      },
    },
  ],
});

Output

{
  "count": 1
}

Delete vectors in a vector store.

collection_name
string

Vector store name.

model
string

Selected embedding model.

Options: jina-v2, clip
ids
array[string]

Document IDs to delete.

TypeScript
Python

const node = new DeleteVectors({
collection_name: "smoke_tests",
model: "jina-v2",
ids: [
"ac32b9a133dd4e3689004f6e8f0fd6cd",
"629df177c7644062a68bceeff223cefa",
],
})

Output

{
"count": 2
}

Transcribe speech in an audio or video file.

audio_uri
string

Input audio.

prompt
string
Optional

Prompt to guide the model on the content and context of the input audio.

language
string
Optional

Language of input audio in ISO-639-1 format.

Default: en
segment
boolean
Optional

Segment the text into sentences with approximate timestamps.

Default: false
align
boolean
Optional

Align the transcription to produce more accurate sentence-level and word-level timestamps. An array of word segments will be included in each sentence segment.

Default: false
diarize
boolean
Optional

Identify speakers for each segment. Speaker IDs will be included in each segment.

Default: false
suggest_chapters
boolean
Optional

Suggest automatic chapter markers.

Default: false
TypeScript
Python

const node = new TranscribeMedia({
audio_uri: "https://media.substrate.run/dfw-clip.m4a",
prompt: "David Foster Wallace interviewed about US culture, and Infinite Jest",
segment: true,
align: true,
diarize: true,
suggest_chapters: true,
})

Output

{
"text": "language like that, the wounded inner child, the inner pain, is part of a kind of pop psychological movement in the United States that is a sort of popular Freudianism that ...",
"segments": [
{
"start": 0.874,
"end": 15.353,
"speaker": "SPEAKER_00",
"text": "language like that, the wounded inner child, the inner pain, is part of a kind of pop psychological movement in the United States that is a sort of popular Freudianism that",
"words": [
{
"word": "language",
"start": 0.874,
"end": 1.275,
"speaker": "SPEAKER_00"
},
{
"word": "like",
"start": 1.295,
"end": 1.455,
"speaker": "SPEAKER_00"
}
]
}
],
"chapters": [
{
"title": "Introduction to the Wounded Inner Child and Popular Psychology in US",
"start": 0.794
},
{
"title": "The Paradox of Popular Psychology and Anger in America",
"start": 16.186
}
]
}
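Segment `start` and `end` values are given in seconds. A small sketch (not part of the SDK) for formatting them as HH:MM:SS.mmm timestamps, e.g. when building subtitle files from the segments:

```typescript
// Format a time in seconds (as in TranscribeMedia segments) as HH:MM:SS.mmm.
function formatTimestamp(seconds: number): string {
  const totalMs = Math.round(seconds * 1000);
  const ms = totalMs % 1000;
  const totalS = Math.floor(totalMs / 1000);
  const s = totalS % 60;
  const m = Math.floor(totalS / 60) % 60;
  const h = Math.floor(totalS / 3600);
  const pad = (n: number, w = 2) => String(n).padStart(w, "0");
  return `${pad(h)}:${pad(m)}:${pad(s)}.${pad(ms, 3)}`;
}

console.log(formatTimestamp(15.353)); // "00:00:15.353"
```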

Generate speech from text.

text
string

Input text.

store
string
Optional

Use "hosted" to return an audio URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the audio data will be returned as a base64-encoded string.

TypeScript
Python

const node = new GenerateSpeech({
text: "Substrate: an underlying substance or layer.",
store: "hosted",
})

Output

{
"audio_uri": "https://assets.substrate.run/84848484.wav"
}
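If `store` is unset, the response carries the audio data as a base64-encoded string rather than a hosted URL. A minimal Node sketch for decoding such a payload to raw bytes (the helper name is ours, not part of the SDK):

```typescript
import { Buffer } from "node:buffer";

// Decode a base64 audio payload (returned when `store` is unset) into raw bytes,
// ready to be written to a file or streamed to a player.
function decodeAudio(b64: string): Buffer {
  return Buffer.from(b64, "base64");
}
```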

Generate speech from text using XTTS v2.

text
string

Input text.

audio_uri
string
Optional

Reference audio used to synthesize the speaker. If unset, a default speaker voice will be used.

language
string
Optional

Language of input text. Supported languages: en, de, fr, es, it, pt, pl, zh, ar, cs, ru, nl, tr, hu, ko.

Default: en
store
string
Optional

Use "hosted" to return an audio URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the audio data will be returned as a base64-encoded string.

TypeScript
Python

const node = new XTTSV2({
text: "Substrate: an underlying substance or layer.",
audio_uri: "https://media.substrate.run/docs-speaker.wav",
store: "hosted",
})

Output

{
"audio_uri": "https://assets.substrate.run/84848484.wav"
}
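Since XTTS v2 accepts only the languages listed above, a client-side guard can reject unsupported codes before making a request. A convenience sketch (not part of the SDK):

```typescript
// Language codes supported by XTTS v2, as listed in the `language` field docs.
const XTTS_LANGUAGES = new Set([
  "en", "de", "fr", "es", "it", "pt", "pl", "zh",
  "ar", "cs", "ru", "nl", "tr", "hu", "ko",
]);

function isSupportedLanguage(code: string): boolean {
  return XTTS_LANGUAGES.has(code.toLowerCase());
}

console.log(isSupportedLanguage("en")); // true
console.log(isSupportedLanguage("ja")); // false — Japanese is not in the list
```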

Remove the background from an image, with the option to return the foreground as a mask.

image_uri
string

Input image.

return_mask
boolean
Optional

Return a mask image instead of the original content.

Default: false
background_color
string
Optional

Hex value background color. Transparent if unset.

store
string
Optional

Use "hosted" to return an image URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the image data will be returned as a base64-encoded string.

TypeScript
Python

const node = new RemoveBackground({
image_uri: "https://media.substrate.run/apple-forest.jpeg",
store: "hosted",
})

Output

{
"image_uri": "https://assets.substrate.run/84848484.jpg"
}

Fill (inpaint) part of an image, e.g. to 'remove' an object.

image_uri
string

Input image.

mask_image_uri
string

Mask image that controls which pixels are inpainted.

store
string
Optional

Use "hosted" to return an image URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the image data will be returned as a base64-encoded string.

TypeScript
Python

const node = new FillMask({
image_uri: "https://media.substrate.run/apple-forest.jpeg",
mask_image_uri: "https://media.substrate.run/apple-forest-mask.jpeg",
store: "hosted",
})

Output

{
"image_uri": "https://assets.substrate.run/84848484.jpg"
}

Upscale an image.

image_uri
string

Input image.

store
string
Optional

Use "hosted" to return an image URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the image data will be returned as a base64-encoded string.

TypeScript
Python

const node = new UpscaleImage({
image_uri: "https://media.substrate.run/docs-seurat.jpg",
store: "hosted",
})

Output

{
"image_uri": "https://assets.substrate.run/84848484.jpg"
}

Segment an image under a point and return the segment.

image_uri
string

Input image.

point
Point

Point prompt.

x
integer

X position.

y
integer

Y position.

store
string
Optional

Use "hosted" to return an image URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the image data will be returned as a base64-encoded string.

TypeScript
Python

const node = new SegmentUnderPoint({
image_uri: "https://media.substrate.run/docs-vg-bedroom.jpg",
point: {
x: 189,
y: 537,
},
store: "hosted",
})

Output

{
"mask_image_uri": "https://assets.substrate.run/84848484.jpg"
}

Segment an image using SegmentAnything.

image_uri
string

Input image.

point_prompts
array[Point]
Optional

Point prompts, to detect a segment under the point. One of point_prompts or box_prompts must be set.

x
integer

X position.

y
integer

Y position.

box_prompts
array[BoundingBox]
Optional

Box prompts, to detect a segment within the bounding box. One of point_prompts or box_prompts must be set.

x1
float

Top left corner x.

y1
float

Top left corner y.

x2
float

Bottom right corner x.

y2
float

Bottom right corner y.

store
string
Optional

Use "hosted" to return an image URL hosted on Substrate. You can also provide a URL to a registered file store. If unset, the image data will be returned as a base64-encoded string.

TypeScript
Python

const node = new SegmentAnything({
image_uri: "https://media.substrate.run/docs-vg-bedroom.jpg",
point_prompts: [
{
x: 189,
y: 537,
},
],
store: "hosted",
})

Output

{
"mask_image_uri": "https://assets.substrate.run/84848484.jpg"
}
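Box prompts use top-left (`x1`, `y1`) and bottom-right (`x2`, `y2`) corners. When you start from a point of interest instead, a small helper (a convenience sketch, not part of the SDK) can build a `BoundingBox` around it:

```typescript
// BoundingBox shape as documented for SegmentAnything box_prompts:
// top-left corner (x1, y1), bottom-right corner (x2, y2).
interface BoundingBox { x1: number; y1: number; x2: number; y2: number; }

// Build a box of the given width and height centered on (cx, cy).
function boxAround(cx: number, cy: number, w: number, h: number): BoundingBox {
  return { x1: cx - w / 2, y1: cy - h / 2, x2: cx + w / 2, y2: cy + h / 2 };
}

console.log(boxAround(189, 537, 100, 100)); // { x1: 139, y1: 487, x2: 239, y2: 587 }
```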

Evaluate code using a code interpreter.

code
string

Code to execute.

args
array[string]
Optional

List of command line arguments.

language
string
Optional

Interpreter to use.

Options: python, typescript, javascript
Default: python
TypeScript
Python

const node = new RunCode({
code: `import json\nimport sys\nprint(json.dumps({'foo':sys.argv[1]}))`,
args: [
"bar",
],
language: "python",
})

Output

{
"output": "{\"foo\": \"bar\"}\n",
"json_output": {
"foo": "bar"
}
}
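As the output above shows, the response includes both the raw stdout (`output`) and, when stdout is valid JSON, a parsed `json_output`. A local sketch of that parsing behavior (not the service's actual implementation):

```typescript
// Mirror RunCode's behavior of exposing stdout as parsed JSON when possible:
// return the parsed value, or null when stdout is not valid JSON.
function parseJsonOutput(stdout: string): unknown | null {
  try {
    return JSON.parse(stdout); // JSON.parse tolerates trailing whitespace
  } catch {
    return null;
  }
}

console.log(parseJsonOutput('{"foo": "bar"}\n')); // { foo: 'bar' }
console.log(parseJsonOutput("plain text output")); // null
```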