Import CSV data into your Parse Server database

  • 1
    Launch the ETL Designer

    Log in to NodeChef. From the NodeChef Task Manager, click DB actions > CSV / JSON import to launch the designer.

  • 2
    Choose the file to import
    • If you are uploading the data, click the Local file system option and choose the file from your file system. The file cannot exceed 192 megabytes, but you can work around this limit by compressing the file with gzip or bzip2 before uploading.
    • You can also specify an HTTP(S) or FTP(S) URI to the file. The ETL engine will download the file automatically and import the data. The Content-Length of the response from the server hosting the file cannot exceed 192 megabytes; you can compress the file on the server with gzip or bzip2.
      If the file is hosted on S3 or GCS, you must enable direct access to the file, or allow anonymous access, before providing the link.
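Before handing the ETL engine a link, you can check that the hosted file fits the 192 MB cap yourself. A minimal Python sketch (the helper names are illustrative, not part of the NodeChef API):

```python
import urllib.request

LIMIT_BYTES = 192 * 1024 * 1024  # the 192 MB cap described above

def within_limit(content_length: int) -> bool:
    """True if the reported Content-Length fits the 192 MB cap."""
    return 0 < content_length <= LIMIT_BYTES

def remote_size(url: str) -> int:
    """Issue a HEAD request and return the server's Content-Length."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return int(resp.headers.get("Content-Length", 0))
```

If `within_limit(remote_size(url))` is false, compress the file on the server before providing the link.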
  • 3
    Select input file compression method

    If you compressed the file you are importing, select the compression method used. Currently gzip and bzip2 are supported.
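For example, gzip-compressing a CSV before upload can be done with Python's standard library (the file names here are illustrative):

```python
import gzip
import os
import shutil
import tempfile

# Write a small sample CSV to a temporary directory (illustrative data).
src = os.path.join(tempfile.mkdtemp(), "export.csv")
with open(src, "w", encoding="utf-8") as f:
    f.write("name,score\nada,10\ngrace,12\n")

# Compress with gzip so files larger than 192 MB can still be uploaded.
dst = src + ".gz"
with open(src, "rb") as fin, gzip.open(dst, "wb") as fout:
    shutil.copyfileobj(fin, fout)

# gzip is lossless: decompressing restores the original contents.
with gzip.open(dst, "rb") as f:
    restored = f.read().decode("utf-8")
```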

  • 4
    Specify the encoding of the file
    If the file you are importing begins with a byte order mark (BOM), you can skip this step. Otherwise, you must specify the correct encoding of the file.

    We currently support ASCII, ISO/IEC 8859, UTF-8, UTF-16 (Little endian), UTF-16 (Big endian).

    If you do not know the character encoding, select UTF-8, by far the most common encoding, and preview the data. It is unlikely that the encoding of your data file is UTF-16; those options are provided for special use cases.
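If you are unsure whether your file carries a BOM, you can check its first bytes yourself. A small sketch covering the BOM-carrying encodings listed above:

```python
import codecs

# Common byte order marks and the encodings they imply.
# UTF-8's 3-byte BOM is checked before the 2-byte UTF-16 BOMs.
BOMS = [
    (codecs.BOM_UTF8, "utf-8"),          # EF BB BF
    (codecs.BOM_UTF16_LE, "utf-16-le"),  # FF FE
    (codecs.BOM_UTF16_BE, "utf-16-be"),  # FE FF
]

def sniff_bom(first_bytes: bytes):
    """Return the encoding implied by a leading BOM, or None."""
    for bom, name in BOMS:
        if first_bytes.startswith(bom):
            return name
    return None
```

If `sniff_bom` returns `None`, the file has no BOM and you must pick the encoding explicitly in the designer.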

  • 5
    Select the file format

    If the file you are importing is a CSV file, select the CSV option. Under the options, if the first row of the CSV file contains the column names, you must select the "Consider first row as column headers" option.

    In some cases the file you are importing is delimited not by a comma but by a tab or another character. For this use case, select Flatfile instead. You can then enter the column and row delimiters under the options.
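For reference, this is what a tab-delimited flat file looks like when parsed with a custom column delimiter, sketched here with Python's csv module (the sample data is illustrative):

```python
import csv
import io

# A tab-delimited flat file: column delimiter "\t", row delimiter "\r\n".
raw = "name\tcity\r\nada\tlondon\r\ngrace\tnyc\r\n"

# csv.reader accepts any single-character delimiter.
reader = csv.reader(io.StringIO(raw), delimiter="\t")
rows = list(reader)
```

These are the same two settings, column delimiter and row delimiter, that the Flatfile options ask for.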

  • 6
    Provide the name of the class in which you want to insert this data.

    If the class does not exist in the target database, it will be created. Class names are case sensitive. The "Update object in database if _id exists" option results in the slowest import, because bulk update operations cannot be performed in this case; each object has to be updated individually.

    Under the various options, select the "Class is a Parse Server class" option and any other applicable options.

    The attributes _created_at and _updated_at will be added automatically to each row if they do not exist, and a Parse Server compatible object id (10 characters long) is generated for each row. The ETL engine can infer the schema of the CSV you are importing if you select the "Use the Parse Server schema" option. However, in some cases you might want more control over the data types you are importing, or need to perform custom transformations.

    If you select the Use the Parse Server schema option, only columns currently created on the class will be imported. All other columns in the input file will be ignored.
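For illustration, a 10-character alphanumeric id of the same shape can be generated like this. This is only a sketch; the ETL engine's actual generator may differ:

```python
import secrets
import string

# 62 URL-safe characters: a-z, A-Z, 0-9.
ALPHABET = string.ascii_letters + string.digits

def pseudo_object_id(length: int = 10) -> str:
    """Generate a random 10-character alphanumeric id, similar in shape
    to the object ids Parse Server assigns (illustrative only)."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))
```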

    • To import a column from the flat file as an array, the values must be delimited with the character "~". The data import engine will automatically split the column on "~" and create an array dynamically. This applies only when the "Use the Parse Server schema" option is selected; otherwise you must handle this with your own transformation logic.
    • To import a geopoint, use the format below. Once the ETL engine reads the _SCHEMA and detects that the column is a geopoint type, it converts the value into a valid Parse Server geopoint. This applies only when the "Use the Parse Server schema" option is selected; otherwise you must handle this with your own transformation logic.
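The array rule from the first bullet above, implemented as your own transformation logic, might look like this sketch (the row and function names are illustrative):

```python
def split_array_column(value: str, sep: str = "~"):
    """Split a '~'-delimited CSV cell into a list, mirroring what the
    import engine does when 'Use the Parse Server schema' is selected."""
    return value.split(sep) if value else []

# Example row as it might come out of the CSV reader.
row = {"name": "ada", "tags": "math~computing~pioneer"}
row["tags"] = split_array_column(row["tags"])
```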
  • 7
    Preview and Import

    Use the preview button to preview the data to be inserted; up to 32 rows will be shown. Once satisfied with the preview, use the import button to import the data.