Spring Boot, along with Spring Batch, provides an excellent foundation for writing batch jobs in Java. In this post, we'll walk through creating a job runner that initiates a process to export a CSV file of Books from a database, all triggered by an HTTP POST request. We'll also build a mechanism to retrieve a list of jobs and the resulting files.
Prerequisites
To follow this guide, you should have a basic understanding of Spring Boot and familiarity with Spring Batch. You'll also need a Spring Boot project set up and connected to a database.
Defining Our Spring Batch Job
Let's start by defining our job. We'll create a simple job that reads from a database, processes the data, and writes it to a CSV file.
- Reader: The reader is responsible for reading data from the database. We'll use JdbcCursorItemReader for this purpose.
##language-java
@Bean
public JdbcCursorItemReader<Book> reader(DataSource dataSource) {
    JdbcCursorItemReader<Book> reader = new JdbcCursorItemReader<>();
    reader.setDataSource(dataSource);
    reader.setSql("SELECT id, title, author FROM book");
    reader.setRowMapper(new BeanPropertyRowMapper<>(Book.class));
    return reader;
}
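BeanPropertyRowMapper builds a Book from each row by calling a no-arg constructor and then the setters that match the column names, so the reader assumes a Book class along these lines (a minimal sketch; adjust the fields to your actual schema):

##language-java
public class Book {

    private Long id;
    private String title;
    private String author;

    // BeanPropertyRowMapper requires a no-arg constructor
    public Book() {
    }

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }

    public String getTitle() { return title; }
    public void setTitle(String title) { this.title = title; }

    public String getAuthor() { return author; }
    public void setAuthor(String author) { this.author = author; }
}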
- Processor: The processor will perform transformations on our data. For this example, we won't be performing any transformations, so our processor is a simple pass-through.
##language-java
@Bean
public ItemProcessor<Book, Book> processor() {
    return book -> book;
}
- Writer: The writer is responsible for writing our processed data to a CSV file.
##language-java
@Bean
public FlatFileItemWriter<Book> writer() {
    FlatFileItemWriter<Book> writer = new FlatFileItemWriter<>();
    writer.setResource(new FileSystemResource("books.csv"));
    writer.setLineAggregator(new DelimitedLineAggregator<Book>() {{
        setDelimiter(",");
        setFieldExtractor(new BeanWrapperFieldExtractor<Book>() {{
            setNames(new String[] {"id", "title", "author"});
        }});
    }});
    return writer;
}
- Job: The job ties our reader, processor, and writer together.
##language-java
@Bean
public Job exportBookJob(JobCompletionNotificationListener listener, Step step) {
    return jobBuilderFactory.get("exportBookJob")
            .incrementer(new RunIdIncrementer())
            .listener(listener)
            .flow(step)
            .end()
            .build();
}
- Step: The step wires the reader, processor, and writer together and sets the chunk size, i.e. how many items are read and processed before being written out in a single transaction.
##language-java
@Bean
public Step step(JdbcCursorItemReader<Book> reader,
                 ItemProcessor<Book, Book> processor,
                 FlatFileItemWriter<Book> writer) {
    return stepBuilderFactory.get("step")
            .<Book, Book>chunk(10)
            .reader(reader)
            .processor(processor)
            .writer(writer)
            .build();
}
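All of the beans above live in a batch configuration class, which is also where jobBuilderFactory and stepBuilderFactory come from: @EnableBatchProcessing makes them available for injection. A skeleton of that class might look like this (the class name is illustrative):

##language-java
@Configuration
@EnableBatchProcessing
public class BatchConfiguration {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    // The reader(), processor(), writer(), step(), and exportBookJob()
    // @Bean methods from the sections above go in this class.
}

Note that the builder factories shown here were deprecated in Spring Batch 5 in favor of JobBuilder and StepBuilder constructed directly with a JobRepository; the factory style still applies to Spring Boot 2.x projects.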
Creating an HTTP Trigger
We want to trigger our job via an HTTP POST request. We can do this by exposing a REST endpoint that launches the job. Here's a simple RestController that does just that:
##language-java
@RestController
public class JobLauncherController {

    private final JobLauncher jobLauncher;
    private final Job job;

    @Autowired
    public JobLauncherController(JobLauncher jobLauncher, Job job) {
        this.jobLauncher = jobLauncher;
        this.job = job;
    }

    @PostMapping("/run-batch-job")
    public ResponseEntity<String> handle() {
        try {
            // A unique "time" parameter so repeated runs aren't rejected as duplicates
            JobParameters jobParameters = new JobParametersBuilder()
                    .addLong("time", System.currentTimeMillis())
                    .toJobParameters();
            jobLauncher.run(job, jobParameters);
            return ResponseEntity.ok("Batch job has been invoked");
        } catch (Exception e) {
            return ResponseEntity.badRequest().body("Could not execute batch job");
        }
    }
}
Adding Job Tracking
Next, we need a way to track our jobs and retrieve the result files. We can do this by storing the JobExecution and the filename in a database whenever a job is completed. Spring Batch provides a `JobExecutionListener` that we can use to hook into job completion events.
##language-java
@Component
public class JobCompletionNotificationListener extends JobExecutionListenerSupport {

    private final JdbcTemplate jdbcTemplate;

    @Autowired
    public JobCompletionNotificationListener(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        // Only record successful runs; afterJob also fires for failed executions
        if (jobExecution.getStatus() == BatchStatus.COMPLETED) {
            String filename = "books.csv"; // Update this based on your logic
            String query = "INSERT INTO job_result (job_id, filename) VALUES (?, ?)";
            jdbcTemplate.update(query, jobExecution.getJobId(), filename);
        }
    }
}
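If each run should produce its own file instead of overwriting books.csv, one option for the "your logic" part is to derive the filename from the job execution id. This is only a sketch (the ExportFilenames helper is not part of Spring Batch); the same name would also have to reach the writer, for example via a job parameter and a step-scoped writer bean:

##language-java
public final class ExportFilenames {

    private ExportFilenames() {
    }

    // Builds a per-execution filename, e.g. "books-42.csv" for execution id 42
    public static String forExecution(long jobExecutionId) {
        return "books-" + jobExecutionId + ".csv";
    }
}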
To retrieve the list of jobs and result files, we can create another REST endpoint:
##language-java
@GetMapping("/jobs")
public List<JobResult> getJobs() {
    return jdbcTemplate.query("SELECT * FROM job_result",
            new BeanPropertyRowMapper<>(JobResult.class));
}
Conclusion
This guide walked you through creating a job runner using Spring Boot and Spring Batch. The job exports data from a database to a CSV file and is triggered by an HTTP POST request. With the ability to track jobs and retrieve result files, you now have a robust, scalable, and flexible way of managing batch jobs in your Spring Boot application.