COPY FROM
COPY FROM copies data into a table using the Postgres COPY protocol.
Syntax
COPY [INTO] <table_name> [ ( <column> [, ...] ) ] FROM STDIN
[ [WITH] ( <option1> [=] <val1> [, ...] ) ]
;
| Syntax element | Description |
|---|---|
| <table_name> | Name of an existing table to copy data into. |
| ( <column> [, ...] ) | If specified, correlate the inserted rows' columns to <table_name>'s columns by ordinal position, i.e. the first column of the row to insert is correlated to the first named column. If not specified, all columns must have data provided, and will be referenced using their order in the table. With a partial column list, all unreferenced columns will receive their default value. |
| [WITH] ( <option1> [=] <val1> [, ...] ) | Optional settings for the copy operation, specified as option = value pairs. |
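As a sketch of the column-list behavior described above, assuming a hypothetical table t (a int, b text, c text):

```sql
-- Only columns a and b are provided in the input; the unreferenced
-- column c receives its default value for every inserted row.
COPY t (a, b) FROM STDIN WITH (FORMAT CSV);
```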
COPY [INTO] <table_name> [ ( <column> [, ...] ) ] FROM [<s3 URI> | <http URL>]
[ [WITH] ( <option1> [=] <val1> [, ...] ) ]
;
| Syntax element | Description |
|---|---|
| <table_name> | Name of an existing table to copy data into. |
| ( <column> [, ...] ) | If specified, correlate the inserted rows' columns to <table_name>'s columns by ordinal position, i.e. the first column of the row to insert is correlated to the first named column. If not specified, all columns must have data provided, and will be referenced using their order in the table. With a partial column list, all unreferenced columns will receive their default value. |
| <s3 URI> | The unique resource identifier (URI) of the Amazon S3 bucket (and prefix) to retrieve the file(s) to be copied from. If using an s3 URI, an AWS connection must be provided in the WITH clause. |
| <HTTP URL> | The URL (for example, an S3 presigned URL) to retrieve the file(s) to be copied from. |
| [WITH] ( <option1> [=] <val1> [, ...] ) | Optional settings for the copy operation, specified as option = value pairs. If copying from an s3 URI, the AWS connection must be specified here. |
Details
S3 Bucket IAM Policies
To use COPY FROM with S3, you need to allow the following actions in your IAM policy:
| Action type | Action name | Action description |
|---|---|---|
| Read | s3:GetObject | Grants permission to retrieve an object from a bucket. |
| List | s3:ListBucket | Grants permission to list some or all of the objects in a bucket. |
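A minimal IAM policy granting these two actions might look like the following. The bucket name example_bucket is a placeholder; note that s3:ListBucket applies to the bucket itself, while s3:GetObject applies to the objects within it:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": ["arn:aws:s3:::example_bucket"]
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": ["arn:aws:s3:::example_bucket/*"]
    }
  ]
}
```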
Text formatting
As described in the Text Format section of PostgreSQL’s documentation.
CSV formatting
As described in the CSV Format section of PostgreSQL’s documentation, except that:

- More than one layer of escaped quote characters returns the wrong result.
- Quote characters must immediately follow a delimiter to be treated as expected.
- Single-column rows containing quoted end-of-data markers (e.g. "\.") will be treated as end-of-data markers despite being quoted. In PostgreSQL, this data would be escaped and would not terminate the data processing.
- Quoted null strings will be parsed as nulls, despite being quoted. In PostgreSQL, this data would be escaped. To ensure proper null handling, we recommend specifying a unique string for null values, and ensuring it is never quoted.
- Unterminated quotes are allowed, i.e. they do not generate errors. In PostgreSQL, all open unescaped quotation punctuation must have a matching piece of unescaped quotation punctuation or it generates an error.
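One way to apply the null-handling recommendation above, assuming the PostgreSQL-style NULL copy option is supported and using an illustrative sentinel string:

```sql
-- Any unquoted occurrence of the sentinel @NULL@ in the input
-- is inserted as SQL NULL; all other values are taken literally.
COPY t FROM STDIN WITH (FORMAT CSV, NULL '@NULL@');
```

Choose a sentinel that cannot appear in real data, and ensure the client writing the CSV never quotes it.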
Limits
You can copy up to 10 GiB of data at a time. If you need to copy more than that, please contact support.
Examples
From STDIN
COPY t FROM STDIN WITH (DELIMITER '|');
COPY t FROM STDIN (FORMAT CSV);
COPY t FROM STDIN (DELIMITER '|');
From AWS S3
Using AWS connection
Perform bulk import:
Using FILES option:
COPY INTO csv_table FROM 's3://example_bucket' (FORMAT CSV, AWS CONNECTION = example_aws_conn, FILES = ['example_data.csv']);
Using the full s3 URI:
COPY INTO csv_table FROM 's3://example_bucket/example_data.csv' (FORMAT CSV, AWS CONNECTION = example_aws_conn);
Using S3-compatible object storage
You can use COPY FROM with any S3-compatible object storage service, such as
Google Cloud Storage, Cloudflare R2, or MinIO. First,
create an AWS connection for S3-compatible storage,
then use it in the COPY command. Make sure your credentials have the necessary
permissions as described in S3 Bucket IAM Policies.
COPY INTO csv_table FROM 's3://my_bucket/my_data.csv' (FORMAT CSV, AWS CONNECTION = gcs_connection);
Using presigned URL
COPY INTO csv_table FROM '<s3 presigned URL>' (FORMAT CSV);
Privileges
The privileges required to execute this statement are:
- USAGE privileges on the schema containing the table.
- INSERT privileges on the table.
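For example, to grant these privileges with standard GRANT statements (the role, schema, and table names here are hypothetical):

```sql
-- Allow example_role to resolve objects in the schema
-- and insert rows into the target table.
GRANT USAGE ON SCHEMA public TO example_role;
GRANT INSERT ON TABLE t TO example_role;
```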