petr_cermak/MySQL-insert

  • Latest build: 0.0.16 (2018-09-06)
  • Created: 2017-07-11
  • Last modified: 2018-09-12

Description

This act takes a crawler execution and inserts its results into a remote MySQL database.


API

To run the actor, send an HTTP POST request to:

https://api.apify.com/v2/acts/petr_cermak~MySQL-insert/runs?token=<YOUR_API_TOKEN>

The POST payload is passed to the actor as its input. For more information, see the Apify API docs.


Example input

Content type: application/json

{
    "_id": "YOUR_EXECUTION_ID",
    "data": {
        "connection": {
            "host"     : "DB_HOSTNAME",
            "user"     : "DB_USERNAME",
            "password" : "DB_PASSWORD",
            "database" : "DB_DATABASE"
        },
        "table": "DB_TABLE_NAME"
    }
}
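
To illustrate, here is a minimal sketch of starting a run with this input from TypeScript (Node 18+ global fetch; the token and all input values are placeholders):

// Sketch: start the actor run with the example input above.
const APIFY_TOKEN = "<YOUR_API_TOKEN>";

const input = {
    _id: "YOUR_EXECUTION_ID",
    data: {
        connection: {
            host: "DB_HOSTNAME",
            user: "DB_USERNAME",
            password: "DB_PASSWORD",
            database: "DB_DATABASE",
        },
        table: "DB_TABLE_NAME",
    },
};

const res = await fetch(
    `https://api.apify.com/v2/acts/petr_cermak~MySQL-insert/runs?token=${APIFY_TOKEN}`,
    {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(input),
    }
);
console.log(await res.json()); // the run object (id, status, ...)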

Readme

act-mysql-insert

Apify act for inserting crawler results into a remote MySQL table.

This act fetches all results from a specified Apifier crawler execution and inserts them into a table in a remote MySQL database.

The act does not store its state, i.e. if it crashes, it restarts by fetching all the results again. Therefore you should only use it for executions with a low number of results.

INPUT

Input is a JSON object with the following properties:

{
    // crawler execution ID
    "_id": "your_execution_id",

    // MySQL connection credentials
    "data": {
        "connection": {
            "host"     : "host_name",
            "user"     : "user_name",
            "password" : "user_password",
            "database" : "database_name"
        },
        "table": "table_name"
    }
}
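
To make the behavior concrete, the following TypeScript sketch shows roughly what the act does with this input: fetch all execution results and insert them one by one. This is an illustration only, not the act's actual source; the legacy results URL and the mysql2 package are assumptions.

// Hypothetical sketch of the act's core logic (not its actual source).
import mysql from "mysql2/promise";

// Assumption: the legacy crawler API serves execution results at this URL.
async function fetchResults(executionId: string): Promise<Record<string, any>[]> {
    const res = await fetch(`https://api.apify.com/v1/execs/${executionId}/results?simplified=1`);
    return res.json();
}

async function run(input: any): Promise<void> {
    // No state is kept: on restart, all results are fetched again.
    const rows = await fetchResults(input._id);
    const conn = await mysql.createConnection(input.data.connection);
    try {
        for (const row of rows) {
            // "??" escapes the table name; "SET ?" expands the object
            // into column = value assignments.
            await conn.query("INSERT INTO ?? SET ?", [input.data.table, row]);
        }
    } finally {
        await conn.end();
    }
}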

The act can also be run from a crawler finish webhook; in that case, put just the contents of the data attribute into the crawler's finish webhook data field.
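
For example, the finish webhook data field would then contain only (placeholder values):

{
    "connection": {
        "host"     : "host_name",
        "user"     : "user_name",
        "password" : "user_password",
        "database" : "database_name"
    },
    "table": "table_name"
}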

In addition to crawler results, it is also possible to specify a dataset ID and fetch the rows from a dataset:

{
    // id of dataset to fetch rows from
    "datasetId": "dataset_id",

    // MySQL connection credentials
    "data": "connection_credentials"
}
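
The dataset variant presumably reads the rows through the Apify API's dataset items endpoint; a sketch of that fetch (an assumption, not the act's source):

// Hypothetical sketch: fetch all items of a dataset as JSON rows.
async function fetchDatasetRows(datasetId: string): Promise<Record<string, any>[]> {
    const res = await fetch(`https://api.apify.com/v2/datasets/${datasetId}/items?format=json`);
    return res.json();
}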

Alternatively, you can specify the rows to be inserted directly (i.e. without fetching them from a crawler execution).

{
    // rows to be inserted
    "rows": [
        {"column_1": "value_1", "column_2": "value_2"},
        {"column_1": "value_3", "column_2": "value_4"},
        ...
    ],

    // MySQL connection credentials
    "data": "connection_credentials"
}
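
Each object in rows presumably maps keys to column names, so all objects should share the same columns. As a sketch (again assuming the mysql2 package, not the act's actual source), such an array translates to a single bulk INSERT:

// Hypothetical sketch: bulk-insert an array of uniform row objects.
import mysql from "mysql2/promise";

async function insertRows(
    connection: mysql.ConnectionOptions,
    table: string,
    rows: Record<string, any>[]
): Promise<void> {
    const conn = await mysql.createConnection(connection);
    try {
        const columns = Object.keys(rows[0]);
        const values = rows.map((row) => columns.map((col) => row[col]));
        // Expands to: INSERT INTO `table` (`column_1`, `column_2`)
        //             VALUES ('value_1', 'value_2'), ('value_3', 'value_4')
        await conn.query("INSERT INTO ?? (??) VALUES ?", [table, columns, values]);
    } finally {
        await conn.end();
    }
}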