# Duplicate, clone a job

A number of parameters within a job can only be set when the job is first created (e.g. source/destination connections and paths). Duplicating a job lets you copy an existing job and override parameters in the "new" job.

## Usage

Jobs can be duplicated through the REST API. The request can optionally include a JSON body that overrides parameters of the source job. The request must include an appropriate Authorization header with a valid access token. An example request and response is shown below.

### Clone Job

```
POST <server-url>/v1/jobs/<job_id>/clone
Content-Type: application/json
```

### Clone Job | Request & Response Body

Request Body:

```json
{
  "name": "Duplicated Job",
  "schedule": {
    "mode": "manual"
  }
}
```

Server Response:

```json
{
  "status": 201,
  "type": "job",
  "job": {
    "kind": "transfer",
    "id": "d5ae28772a6b4e50a8fea09df4fc2ceb",
    "name": "Duplicated Job",
    "disabled": false,
    "type": "job",
    "links": {
      "self": {
        "href": "http://localhost:9090/v1/jobs/d5ae28772a6b4e50a8fea09df4fc2ceb"
      }
    }
  }
}
```

## Delete or Disable Original Job

After cloning a job, make sure you either delete the original job or at least remove its schedule, so the two jobs do not cause transfer conflicts. Deleted jobs remain in the database and can still be queried; the job history is NOT deleted. DataHub simply sets a delete flag so the job is removed from REST API responses. When you delete a job, DataHub flags it, and a check that runs roughly every 5 seconds on each node cancels the job if it is currently running.

Delete Original Job:

```
DELETE {{url}}v1/jobs/{{original_job_id}}
```

Modify Job Schedule:

```
PATCH {{url}}v1/jobs/{{original_job_id}}
```

Request Body:

```json
{
  "kind": "transfer",
  "schedule": {
    "mode": "manual"
  }
}
```

Related: Scheduling a Job
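The clone/delete/reschedule workflow above can be sketched as a small client. This is a minimal illustration, not official DataHub tooling: the `BASE_URL`, token value, helper names, and the `Bearer` authorization scheme are assumptions; only the HTTP methods, paths, and JSON bodies come from the documentation above.

```python
import json
import urllib.request

# Hypothetical values -- substitute your DataHub server URL and access token.
BASE_URL = "http://localhost:9090"
ACCESS_TOKEN = "<access-token>"


def _request(method, path, body=None):
    """Build an authenticated DataHub REST request (not sent yet)."""
    data = json.dumps(body).encode("utf-8") if body is not None else None
    return urllib.request.Request(
        url=f"{BASE_URL}{path}",
        data=data,
        method=method,
        headers={
            # Bearer scheme is an assumption; the docs only require a
            # valid access token in the Authorization header.
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/json",
        },
    )


def clone_job(job_id, overrides=None):
    # POST /v1/jobs/<job_id>/clone -- optional JSON body overrides
    # parameters of the source job (e.g. "name", "schedule").
    return _request("POST", f"/v1/jobs/{job_id}/clone", body=overrides or {})


def delete_job(job_id):
    # DELETE /v1/jobs/<job_id> -- flags the job as deleted; history is kept.
    return _request("DELETE", f"/v1/jobs/{job_id}")


def set_manual_schedule(job_id):
    # PATCH /v1/jobs/<job_id> -- switch the original job to a manual
    # schedule so it no longer conflicts with the clone.
    return _request(
        "PATCH",
        f"/v1/jobs/{job_id}",
        body={"kind": "transfer", "schedule": {"mode": "manual"}},
    )
```

To actually send one of these requests, pass it to `urllib.request.urlopen` and decode the JSON response, e.g. `json.load(urllib.request.urlopen(clone_job(job_id, {"name": "Duplicated Job"})))`.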