RethinkDB Import

rethinkdb import --help

 

`rethinkdb import` loads data into a RethinkDB cluster

  rethinkdb import -d DIR [-c HOST:PORT] [-a AUTH_KEY] [--force]
      [-i (DB | DB.TABLE)] [--clients NUM]
  rethinkdb import -f FILE --table DB.TABLE [-c HOST:PORT] [-a AUTH_KEY]
      [--force] [--clients NUM] [--format (csv | json)] [--pkey PRIMARY_KEY]
      [--delimiter CHARACTER] [--custom-header FIELD,FIELD... [--no-header]]

  -h [ --help ]                 print this help
  -c [ --connect ] HOST:PORT    host and client port of a rethinkdb node to
                                connect to (defaults to localhost:28015)
  -a [ --auth ] AUTH_KEY        authorization key for rethinkdb clients
  --clients NUM_CLIENTS         the number of client connections to use
                                (defaults to 8)
  --hard-durability             use hard durability writes (slower, but less
                                memory consumption on the server)
  --force                       import data even if a table already exists, and
                                overwrite duplicate primary keys
  --fields                      limit which fields to use when importing one
                                table

Import directory:
  -d [ --directory ] DIR        the directory to import data from
  -i [ --import ] (DB | DB.TABLE)
                                limit restore to the given database or table
                                (may be specified multiple times)

Import file:
  -f [ --file ] FILE            the file to import data from
  --table DB.TABLE              the table to import the data into
  --format (csv | json)         the format of the file (defaults to json)
  --pkey PRIMARY_KEY            the field to use as the primary key in the table

Import CSV format:
  --delimiter CHARACTER         character separating fields, or '\t' for tab
  --no-header                   do not read in a header of field names
  --custom-header FIELD,FIELD...
                                header to use (overriding file header), must be
                                specified if --no-header
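As a sketch of what the CSV importer consumes, the following Python snippet writes a small comma-delimited file with a header row. The file name and field names are invented for illustration; by default the importer reads the first line of the file as the field names unless `--no-header` is given.

```python
import csv

# Hypothetical rows; the field names here are illustrative only.
rows = [
    {"count": 1, "url": "/index.html"},
    {"count": 2, "url": "/about.html"},
]

# Write a header line followed by the data rows; `rethinkdb import`
# treats the first line as the field names by default.
with open("site_history.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["count", "url"])
    writer.writeheader()
    writer.writerows(rows)

with open("site_history.csv") as f:
    print(f.read().splitlines()[0])  # header line: count,url
```

Such a file could then be loaded with something along the lines of `rethinkdb import -f site_history.csv --format csv --table test.history`.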

EXAMPLES:

rethinkdb import -d rdb_export -c mnemosyne:39500 --clients 128
  Import data into a cluster running on host 'mnemosyne' with a client port at 39500,
  using 128 client connections and the named export directory.

rethinkdb import -f site_history.csv --format csv --table test.history --pkey count
  Import data into a local cluster and the table 'history' in the 'test' database,
  using the named CSV file, and using the 'count' field as the primary key.

rethinkdb import -d rdb_export -c hades -a hunter2 -i test
  Import data into a cluster running on host 'hades' which requires authorization,
  using only the database 'test' from the named export directory.

rethinkdb import -f subscriber_info.json --fields id,name,hashtag --force
  Import data into a local cluster using the named JSON file, and only the fields
  'id', 'name', and 'hashtag', overwriting any existing rows with the same primary key.
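As a rough sketch of the input the JSON importer reads, a file like the one in the example above could be produced as follows. This assumes the common array-of-documents layout; the file name comes from the example, but the document values are invented.

```python
import json

# Invented documents using the field names from the example above.
docs = [
    {"id": 1, "name": "alice", "hashtag": "#rethinkdb"},
    {"id": 2, "name": "bob", "hashtag": "#nosql"},
]

# A single JSON array of objects, one object per row to import.
with open("subscriber_info.json", "w") as f:
    json.dump(docs, f)

with open("subscriber_info.json") as f:
    print(json.load(f)[0]["name"])  # alice
```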

rethinkdb import -f user_data.csv --delimiter ';' --no-header --custom-header id,name,number
  Import data into a local cluster using the named CSV file, which has no header;
  the fields are instead named 'id', 'name', and 'number', and the delimiter is a
  semicolon (rather than a comma).
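A headerless, semicolon-delimited file like the one this example expects could be generated as follows (the row values are made up). Because no header row is written, the field names have to be supplied on the command line with `--no-header --custom-header id,name,number`.

```python
import csv

# Made-up rows in the order id, name, number.
rows = [
    [1, "alice", "555-0100"],
    [2, "bob", "555-0101"],
]

# Semicolon delimiter and no header row; the field names must therefore
# be given via --custom-header when importing.
with open("user_data.csv", "w", newline="") as f:
    writer = csv.writer(f, delimiter=";")
    writer.writerows(rows)

with open("user_data.csv") as f:
    print(f.read().splitlines()[0])  # 1;alice;555-0100
```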

About Neil Rubens

see http://ActiveIntelligence.org
