Batch
Run AWS batch jobs without installing software or servers
You can perform AWS batch job processing directly in the DuploCloud Portal without the additional overhead of installed software, allowing you to focus on analyzing results and diagnosing problems.
Create scheduling policies to define how your batch jobs are prioritized and how compute resources are shared among jobs in a queue.
From the DuploCloud Portal, navigate to the Cloud Services -> Batch page, and click the Scheduling Policies tab.
Click Add. The Create Batch Scheduling Policy page displays.
Create the scheduling policy using the AWS documentation as a guide; the fields in the AWS documentation map to the fields on the DuploCloud Create Batch Scheduling Policy page.
Click Create.
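For reference, the fields on the Create Batch Scheduling Policy page correspond to the parameters of an AWS Batch fair-share scheduling policy. The sketch below expresses such a policy in YAML using the AWS CreateSchedulingPolicy parameter names; the policy name, share identifiers, decay window, and weights are illustrative values only, not recommended settings.

```yaml
# Illustrative fair-share scheduling policy, written with the AWS
# CreateSchedulingPolicy parameter names. All values are examples.
name: nightly-batch-policy
fairsharePolicy:
  shareDecaySeconds: 3600       # how far back usage counts against a share
  computeReservation: 10        # percent of capacity reserved for inactive shares
  shareDistribution:
    - shareIdentifier: critical
      weightFactor: 0.5         # lower weight = larger slice of compute
    - shareIdentifier: default
      weightFactor: 1.0
```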
AWS compute environments (Elastic Compute Cloud [EC2] instances) map to DuploCloud Infrastructures. The settings and constraints in the compute environment define how instances are configured and automatically launched.
In the DuploCloud Portal, navigate to Cloud Services -> Batch.
Click the Compute Environments tab.
Click Add. The Add Batch Environment page displays.
In the Compute Environment Name field, enter a unique name for your environment.
From the Type list box, select the environment type (On-Demand, Spot, Fargate, etc.).
Modify additional defaults on the page or add configuration parameters in the Other Configurations field, as needed.
Click Create. The compute environment is created.
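The Other Configurations field is where environment-specific settings go. As an illustration only, the sketch below shows the kind of compute-resource settings AWS Batch supports, written with the AWS ComputeResources parameter names; the exact keys your portal version accepts may differ, and the values are placeholders.

```yaml
# Illustrative compute-resource settings using the AWS Batch
# ComputeResources parameter names; values are placeholders.
computeResources:
  minvCpus: 0          # allow the environment to scale to zero when idle
  maxvCpus: 64
  desiredvCpus: 0
  instanceTypes:
    - optimal          # let AWS Batch choose from the C, M, and R instance families
```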
After you define compute environments, create queues for your batch jobs to run in. For more information, see the AWS instructions for creating a job queue.
From the DuploCloud Portal, navigate to the Cloud Services -> Batch page, and click the Queues tab.
Click Add. The Create Batch Queue page displays.
Create the batch job queue using the AWS documentation as a guide; the fields in the AWS documentation map to the fields on the DuploCloud Create Batch Queue page.
Click Create. The batch queue is created.
In the Priority field, enter a whole number. Job queues with a higher priority number are run before those with a lower priority number in the same compute environment.
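A queue ties a priority and, optionally, a scheduling policy to one or more compute environments. The sketch below shows these relationships using the AWS CreateJobQueue parameter names; the queue name, policy ARN, and compute environment name are placeholders.

```yaml
# Illustrative job queue definition using the AWS CreateJobQueue
# parameter names; names and the ARN are placeholders.
jobQueueName: analytics-queue
state: ENABLED
priority: 10            # queues with higher numbers are scheduled first
schedulingPolicyArn: arn:aws:batch:us-east-1:111122223333:scheduling-policy/nightly-batch-policy
computeEnvironmentOrder:
  - order: 1
    computeEnvironment: my-compute-environment
```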
Before you can run AWS batch jobs, you need to create job definitions specifying how batch jobs are run.
From the DuploCloud Portal, navigate to Cloud Services -> Batch, and click the Job Definitions tab.
Click Add. The Create Batch Job Definition page displays.
Define your batch jobs using the AWS documentation as a guide; the fields in the AWS documentation map to the fields on the DuploCloud Create Batch Job Definition page.
Click Create. The batch job definition is created.
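As a point of reference, a container job definition in AWS Batch combines an image, a command, and resource requirements. The sketch below uses the AWS RegisterJobDefinition parameter names; the image URI, command, and resource values are placeholders for your own workload.

```yaml
# Illustrative container job definition using the AWS RegisterJobDefinition
# parameter names; image, command, and resource values are placeholders.
jobDefinitionName: nightly-report
type: container
containerProperties:
  image: 111122223333.dkr.ecr.us-east-1.amazonaws.com/reports:latest
  command: ["python", "run_report.py"]
  resourceRequirements:
    - type: VCPU
      value: "1"
    - type: MEMORY
      value: "2048"     # MiB
retryStrategy:
  attempts: 2
```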
Add a job for AWS batch processing. See the AWS documentation for more information about batch jobs.
After you configure your compute environment, navigate to Cloud Services -> Batch and click the Jobs tab.
Click Add. The Add Batch Job page displays.
On the Add Batch Job page, complete the Job Name, Job Definition, Job Queue, and Job Properties fields.
Optionally, if you created a scheduling policy to apply to this job, paste the YAML code below into the Other Properties field.
Click Create. The batch job is created.
As you create a batch job, paste the following YAML code into the Other Properties field on the Add Batch Job page. Replace the scheduling priority override value ("1" in this example) with an integer representing the job's scheduling priority, and replace SHARE_IDENTIFIER with the job's share identifier. For more information, see the AWS documentation.
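A minimal sketch of that YAML follows; the key names and casing are an assumption based on the AWS SubmitJob parameters, so confirm them against the Add Batch Job page in your portal version.

```yaml
# Assumed key names, modeled on the AWS SubmitJob parameters.
SchedulingPriorityOverride: 1        # replace 1 with the job's scheduling priority
ShareIdentifier: SHARE_IDENTIFIER    # replace with the job's share identifier
```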
From the DuploCloud Portal, navigate to Cloud Services -> Batch, and click the Jobs tab. The jobs list displays.
Click the name of the job to view job details such as job status, ID, queue, and definition.
See the AWS Batch Best Practices Guide for information about running your AWS batch jobs.